'Lady And The Dale,' A 4-Part Series About A 3-Wheeled Car, Is A Wild Ride

The Lady and the Dale is a new HBO documentary miniseries co-directed by Nick Cammilleri and Zackary Drucker. It's about a female automobile executive who took on the Detroit automakers and tried to market a gas-efficient car in the 1970s, at the height of the oil crisis.


Fresh Air with Terry Gross, January 27, 2021: Interview with Jon Fasman; Review of 'The Lady and the Dale.'

Transcript

DAVE DAVIES, HOST:

This is FRESH AIR. I'm Dave Davies, in for Terry Gross, who's off this week.

By now, many of us are used to the idea that American intelligence services, such as the National Security Agency, have enormous capacities to track our phone calls, emails and movements, and we hope that rational and constitutional rules for their use are set by our elected leaders. But our guest, journalist Jon Fasman, says most of us don't realize that thousands of police departments across the country also have access to some really powerful surveillance tools with relatively little oversight. There are devices that scan and store the locations of thousands of auto license plates they encounter randomly on the street; portable gadgets called Stingrays that electronically mimic a cellphone tower and get every mobile phone within range to yield its data; and cameras - cameras everywhere, increasingly with facial recognition software.

Fasman's new book explores these new technologies, the questions they raise about privacy and the controls he believes citizens should impose on the agencies that use them. His book is "We See It All: Liberty And Justice In An Age Of Perpetual Surveillance."

Jon Fasman is the U.S. digital editor for The Economist and the author of two novels. He joins me from his home in suburban New York.

Jon Fasman, welcome to FRESH AIR. You know, this book raises a lot of skeptical questions about law enforcement. And I'd like to begin by just having you explain what kind of time you spent with police officers and police chiefs and sheriffs and others, observing their practice and getting their point of view.

JON FASMAN: Well, over the course of my reporting, going back to 2010, when I was the Atlanta correspondent for The Economist, I've written a great deal about criminal justice issues. And I've embedded with a number of departments over the past decade.

For this book in particular, I probably spent the most time with the LAPD. I went out with their patrol division, and I spent time with Sean Malinowski, who was then an assistant chief, learning about a predictive policing program they use called PredPol. And we can talk about that later. I also embedded for several days with the Newark Police Department to learn about how technology is used by police officers in their day-to-day jobs.

So one of the challenges in writing about this is that I would hear about this technology from tech companies and from police chiefs from a sort of 30,000-foot view. But I really wanted to see how police officers, as they work, integrate technology into their daily jobs, how it changes the sort of practice of being a police officer.

DAVIES: OK, so let's talk about some of these technologies. One of the things, of course, that people are aware of is video surveillance. I mean, there are, you know, security cameras in so many places that police typically consult after there are crimes in given areas. But you're looking at ways that this is being expanded. And particularly in Newark, N.J., police have something called Citizen Virtual Patrol. You want to explain what this is, how it works?

FASMAN: Sure. The Citizen Virtual Patrol is a network of cameras placed throughout the city. These are public cameras. Now, when you talk about CCTV cameras, the overwhelming majority of them are privately owned. But the Citizen Virtual Patrol is a network composed of publicly owned cameras that people can access from a laptop. Now, the idea behind this was to allow people to observe and perhaps testify to crimes from behind a veil of anonymity. So it gives people an eye on the entire city. It lets people see what's going on.

DAVIES: And, Jon, when you say people, you mean just ordinary citizens. Anybody can dial up and look through these cameras.

FASMAN: So that's right. A citizen anywhere can log in to the Citizen Virtual Patrol of Newark. I live about 50 miles north of Newark, and I can log in at my desk and see the feed from any one of the 126 cameras that the Newark Public Safety Department has placed around the city.

DAVIES: That seems a little intrusive, right? I mean, somebody who just wants to be a busybody - I guess other people would say it's no different from looking out a window, right? You're not peeping into somebody's apartment. What concerns does this raise?

FASMAN: That's right. Technically, the cameras don't show anything that an observer on the street couldn't see. So it shows public streets. It's not aimed at anyone's apartment. It's not looking inside anyone's window. On the other hand, it does show people's yards, where they have a slightly higher expectation of privacy, and it could provide some information that people wouldn't want known about them.

For instance, let's say I had an ex who lived in Newark, and I was watching the camera trained on her house. If I saw her leave with some suitcases and then saw no activity at her house for a couple of days, I could surmise that she wasn't there. I could also know when she goes out, when she comes home, who comes to her house - all things that, of course, I could find out if I observed her in front of her house, but that would make me visible. This renders me invisible. And it lets me observe - lets anyone who logs in observe - an enormous swath of the city.

So that's another thing I think we need to think about when we're thinking of police technologies, and that is that any single instance may be unobjectionable. But when it comes to scale, you're talking about something very different.

DAVIES: Do the police think that it's been helpful? Have there been any complaints, or any notable benefits to law enforcement, from having all of these windows on the street?

FASMAN: The police do think it's been helpful. They think it helps people keep an eye on their neighborhood. The idea was that a resident of a certain neighborhood could watch what was happening around her without making herself known to those around her as someone who was watching. So it lets her keep that eye on her surroundings behind that veil of anonymity. And the Newark police think that's good for public safety.

On the other hand, I spoke to Amol Sinha, the director of the ACLU of New Jersey, who made the same point that I just made: that what we're talking about is, at scale, quite different. The cameras give information about people and their private activities that they wouldn't necessarily want known, and that, if you tried to gather it in person, you would be observed gathering. This lets you do so anonymously and invisibly. And that anonymity and invisibility has, you know, benefits, according to the Newark police, and also detriments, according to the ACLU.

DAVIES: Elsewhere in the United States, we're seeing the use of automatic license plate readers, which - I confess I was not aware of this. Describe what these things are, how big they are, who uses them, what they do.

FASMAN: Well, these things are small cameras that attach to police cars. They're things you really wouldn't see unless you were looking for them. They're sort of little, flat cameras either on the front or on the roof of the police car. And what they do is, as they pass, they capture an image of each license plate, and they translate that image into just plain letters and numbers. They log the geospatial data - so where the car was and what time it was observed. And it all just goes into a database.
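
To make that pipeline concrete, here is a minimal Python sketch of one plate read becoming a database row. The plate_reads schema and every name in it are hypothetical, not anything a real ALPR vendor ships, and the OCR step is stubbed out:

```python
import sqlite3
from datetime import datetime, timezone

def open_db(path="alpr.db"):
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS plate_reads ("
        "plate TEXT, lat REAL, lon REAL, seen_at TEXT, "
        "flagged INTEGER DEFAULT 0)"  # 1 if tied to a stolen car or a crime
    )
    return db

def store_plate_read(db, plate_text, lat, lon):
    """Log one read: which plate, where, and when (UTC)."""
    db.execute(
        "INSERT INTO plate_reads (plate, lat, lon, seen_at) VALUES (?, ?, ?, ?)",
        (plate_text, lat, lon, datetime.now(timezone.utc).isoformat()),
    )
    db.commit()

# In a real system the plate text would come from OCR on a camera frame.
db = open_db()
store_plate_read(db, "ABC1234", 40.7357, -74.1724)
```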

And so, again, the issue with that is one of scale. There is nothing illegal about the police noting the license plate of cars parked in public, where they were parked, when they were parked there. But if you look down your street and you saw a police officer writing down every license plate all day, every day, you might wonder why he was doing that. A police department, if they had to assign people to do that, might wonder whether it was worth it - right? - whether it was worth the manpower to have someone just noting down license plates. But what these ALPRs do is they obviate those decisions. They make it extremely easy and extremely cheap to always be taking pictures of where cars were at any one time.

And unless you live in one of the very few cities in which you don't need to have a car because public transport is so reliable, this essentially lets police put together a very granular, detailed roster of where you go, when you go and who you see. And again, this is the sort of thing that they could do. There's nothing illegal about it. But it's a question of what people want from the state. Do they want to have that information on file all the time?

DAVIES: So let me make sure I understand this. So police cars that have these license plate readers drive down the street, and they automatically just pick up every license plate within range and...

FASMAN: Yeah.

DAVIES: ...Feed it into the database. Now, if it finds that there's a stolen car among them, does it flag it or is it simply going into a database?

FASMAN: If it finds that the car is stolen or is associated with a crime, then it does flag it. And that's the justification that police departments give. And it's a good justification. I'm not suggesting, in writing about the dangers of ALPRs, that we eliminate them, because they do help find stolen cars. They do help find cars used in crimes. The issue is, what about the 99.99% of cars that aren't involved in crimes? What do you do with their data? States have wildly different laws on how long that data can be kept. In New Hampshire, for instance, if the car is not associated with any crime and is not being looked for, then you've got to delete the record within three minutes. There are other states that set 24-hour limits.

But there are a lot of states that set no limits at all. And they just throw these pictures into a huge database. Often, those databases are poorly secured. In 2015, a journalist just stumbled onto the Boston Police Department's entire database of ALPR data. It's that sort of thing - the collection at scale, the lack of regulation over how long the information is kept and, often, the lack of security over how it's stored - that combines to make these technologies really worrying.
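
A retention rule like the ones described here is simple to express against that same hypothetical schema. This sketch assumes a New Hampshire-style window and the flagged column marking reads tied to a crime:

```python
from datetime import datetime, timedelta, timezone

# New Hampshire-style window; other states use 24 hours or set no limit at all.
RETENTION = timedelta(minutes=3)

def purge_expired(db):
    """Delete unflagged plate reads older than the retention window."""
    cutoff = (datetime.now(timezone.utc) - RETENTION).isoformat()
    db.execute(
        "DELETE FROM plate_reads WHERE flagged = 0 AND seen_at < ?",
        (cutoff,),
    )
    db.commit()
```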

DAVIES: So I can imagine abuse. I mean, if I were stalking someone, you know, someone I'd had a relationship with or someone who's spurned my advances and I get access to this, I can see where they go, where they are all the time, what kind of people they frequent. I could also see police saying, well, after, you know, there's been a terrible crime committed and we've identified a suspect, if they're associated with a license plate, we can look through this huge, searchable mass of data and see, aha, this is where their friends are. Aha, that's where they might just be. Is it used in these ways?

FASMAN: That is how it's often used. And you're right to highlight the threat of abuse. I mean, the other threat I can think of is let's say there is a demonstration against police brutality. And there is an individual officer who takes issue with the statements, First Amendment-protected activity, of a certain protester. That officer can then access, in some cases, ALPR data and find out where the person goes, who he sees. He can then basically reverse time and look and see if the car has done anything illegal, has been anywhere it shouldn't have been. And it's that sort of thing, the information it lets police build and store for later use that worries me.
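
And the "reverse time" lookup described here is, on the same hypothetical schema, nothing more than a query:

```python
def plate_history(db, plate):
    """Everywhere and every time a given plate was seen, oldest first."""
    return db.execute(
        "SELECT lat, lon, seen_at FROM plate_reads "
        "WHERE plate = ? ORDER BY seen_at",
        (plate,),
    ).fetchall()
```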

DAVIES: So your take is, if you scoop all this stuff up, if the license plates show no reason for retention, it has to be scrapped quickly, right? And the rules are all over the place?

FASMAN: The rules are all over the place with a lot of this technology because it's so new and because it changes so quickly. The degree of oversight that individual police departments get is all over the place, too. In general, even when there are regulations, there often aren't penalties for violating them, or not strong enough penalties. And so it's that sort of thing. I have not written an anti-technology book. I have written a pro-democracy, pro-regulation book.

DAVIES: We need to take a break here. Let me reintroduce you. We are speaking with Jon Fasman. He is the U.S. digital editor for The Economist. His new book is "We See It All: Liberty And Justice In An Age Of Perpetual Surveillance." We'll continue our conversation in just a moment. This is FRESH AIR.

(SOUNDBITE OF JOAN JEANRENAUD'S "AXIS")

DAVIES: This is FRESH AIR. And we're speaking with Jon Fasman. He is the U.S. digital editor for The Economist magazine. He has a new book which explores cutting-edge surveillance technologies employed by local police departments, often with little oversight. The book is called "We See It All." You also write about a technology called ShotSpotter. Tell us about this.

FASMAN: ShotSpotter is an acoustic sensor designed to detect the sound of gunshots. I saw these in place in Newark, N.J. They often look like little, white diamonds or rectangles up on traffic light poles. And they're trained to recognize what ShotSpotter calls loud, impulsive sounds between 120 and 160 decibels. When a sensor does hear such a sound, it sends an alert to the ShotSpotter headquarters, where a human listens to it and figures out, was that a gunshot? Was it a car backfiring? When I was at ShotSpotter's headquarters in California, there was an alert caused by a truck's Jake brake, you know, the engine brake that releases a tremendous amount of sound quite quickly. Once it confirms a gunshot, ShotSpotter notifies the local police department. It tells the police department how many shots, where and when. And it, essentially, can dispatch officers to the scene of a suspected shooting.
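
ShotSpotter's actual detection algorithms are proprietary; as a rough illustration only, here is a toy Python sketch of flagging "loud, impulsive sounds" - short audio windows whose level jumps past a threshold - for human review:

```python
import numpy as np

def find_impulsive_events(samples, rate, window_s=0.05, threshold_db=120.0):
    """Return start times (in seconds) of windows exceeding the threshold.

    Assumes samples are calibrated so dB values correspond to sound
    pressure level; real systems also triangulate across many sensors.
    """
    win = int(rate * window_s)
    events = []
    for i in range(0, len(samples) - win, win):
        chunk = samples[i:i + win]
        rms = np.sqrt(np.mean(chunk ** 2))
        level_db = 20 * np.log10(rms + 1e-12)
        if level_db >= threshold_db:
            events.append(i / rate)  # each hit goes to a human reviewer
    return events
```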

DAVIES: Right. Now, that sounds like it would be valuable. I mean, a lot of times, people hear gunshots and don't report them. And if police can get quickly to the scene of a shooting, it would seem there's more chances of either saving a life, apprehending a suspect, assisting a victim. How - what's the record on its performance?

FASMAN: The record on its performance is OK, I think. I mean, I'm just judging it by what the company itself has told me, which is that it has solved crimes. It has helped identify shooters and shootings. There are some places that have had less luck with it than others. But in general, it does seem to dispatch officers to a shooting. Now, I was with the Newark Police Department when we got dispatched to two shootings by ShotSpotter. And when we got there, there was nobody there. Now, it's not a panacea, obviously. But it does seem to get police officers to the scene of a shooting. And you're right, the number of gunshots that result in actual police calls is quite low. I think it's something like 9%. And so this is a way of alerting police that a shooting has happened without waiting for someone to call it in.

DAVIES: All right. So what might give you pause about recommending this for a department? What are the drawbacks or concerns?

FASMAN: Here's what I think is interesting about ShotSpotter. As I was reporting this story, I talked to people on buses, on trains and in cabs in multiple cities - cab drivers, people I was sitting next to on trains, people I talked to on the street. A lot of them believed that this sort of technology was being used to overhear private conversations. Now, I think that is extremely unlikely. To my knowledge, there's never been a case brought based on a conversation overheard by ShotSpotter.

Every police department, everyone from ShotSpotter, said it doesn't hear conversations. It's trained to recognize loud, sudden sounds. That's not how people talk. But it's an instructive lesson in how deploying technology can improve or worsen relations between police and the communities they police. I think that in too many instances, police departments approve the purchase of ShotSpotter sensors and deploy them without doing the work of going into the communities they police and saying, listen; this is what's going up on these traffic lights. Here is how it works. It doesn't hear conversation. Here's how we know it doesn't hear conversation. If you have any questions about it, please come and talk to us. Instead, citizens - who often come from communities with a long history of distrust of police, built up over years for valid reasons - just see another piece of tech go up there.

I mean, you can imagine it yourself. If you were a citizen of color who lived in a community that had a long history of distrust of the police, and all of a sudden the police said this hears gunshots, you might think to yourself, what else does it hear, you know? And so one of the things that I point out in my book is that technology is not good or bad in itself. But police have to be very careful about how they roll it out and have to go out of their way to gain the trust of the public, particularly in those communities with a long history of distrust of the police.

DAVIES: You also write about another device. This is something I had never heard of. I guess the trade name is Stingray for this thing. It's a device that can simulate a cellphone tower so that when you're driving by it, your cellphone thinks it's a mobile phone tower and connects with it. What happens then?

FASMAN: Well, once your phone connects to one of these things - the trade name is Stingray; the technical term is IMSI-catcher. IMSI stands for international mobile subscriber identity. Every mobile phone has a unique number. And it's that number that identifies itself to the cellphone tower to let the tower know that, hey, this is, you know, Dave Davies' phone. So if you have any messages or texts for Dave Davies, send them to this phone. An IMSI-catcher, or Stingray, mimics a cellphone tower, and it gets your phone to connect to it. And what happens then is that all of the metadata from your phone - that is, the non-voice-call data - can then be read. And that includes texts you might send, websites you might browse, who you called and how long you talked for, even without knowing the actual substance of the conversation. All of that is exposed when the phone connects.
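
Based on that description, the kind of record a cell-site simulator could log per phone might look like the following sketch. The fields are illustrative only; the actual formats these devices use are not public:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PhoneContact:
    imsi: str             # the phone's unique subscriber identity
    first_seen: datetime  # when the phone attached to the fake "tower"
    signal_dbm: int       # signal strength, usable for rough geolocation
    metadata: list = field(default_factory=list)  # call/text/web records, not content

phone = PhoneContact("310150123456789", datetime.now(timezone.utc), -67)
phone.metadata.append({"type": "call", "to": "5550123", "duration_s": 94})
```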

DAVIES: And does it geolocate you, too? Does it tell...

FASMAN: Yes, it geolocates you, too.

DAVIES: So what do the police do with this information when they get it? And they must be getting thousands and thousands of cellphones coming in contact with this thing, right?

FASMAN: Yeah. And again, increasingly, it's deployed by court order. But that hasn't always been the case. And what happens is even when deployed by court order against a specific subject, the data from every other phone in that area is hoovered up. Now, again, this happens on a stakeout, too, right? If the police are staking out a suspect, they see all kinds of people walking past. The difference is they don't retain the data from all those people walking past. In the case of data hoovered up by Stingrays, that often does get kept for longer than it should. And again, this is an issue in which there's no question that Stingrays can help police catch serious criminals. But there just needs to be some regulations over when they can be used and what happens to the data hoovered up incidentally during those stakeouts.

DAVIES: Do these things have to be mounted on towers?

FASMAN: No, they do not. They fit in the trunk of a car. They're quite small. They're sort of suitcase-sized.

DAVIES: So the typical way it might be used is the police wants to track someone who they suspect is a, you know, major drug dealer or a suspect in a homicide. And they find a spot near that person and wait for the cellphone to interact with the Stingray?

FASMAN: That's right. And they're probably just paying attention to that actual cell data. But there's a lot of other data that's taken up in the process. I should say, another thing that's striking about Stingrays is that their use is really shrouded in secrecy. So everything that I know about them, everything that we know about them, comes from FOIA requests and from court documents. Often, police departments that use a Stingray have to sign a nondisclosure agreement, so they won't talk about whether they have one or how it's used. I obviously wrote to the company for an interview. They declined.

DAVIES: OK. But when they scoop up all this other data about hundreds or thousands of other cellphones, does that get retained? Do we know?

FASMAN: We don't know for sure. It could get retained. It could get thrown away. This, again, is one of the issues I'm concerned about: we just need transparency in where that data is stored and how it's used. And there need to be limits, set by communities, on what the police can do with it.

DAVIES: So this is sort of a gray area of the law. I mean, can departments just set these things up without going to court and getting a warrant?

FASMAN: I mean, all of this technology that I write about is a gray area of the law because it is so new. Increasingly, there's pressure from groups like the American Civil Liberties Union and the Electronic Frontier Foundation - groups that are really concerned about the dangers I write about. They have succeeded, I think, in getting some information released to the public. But again, because this technology is new, and because we have historically been reluctant to regulate the police, all of these emerging technologies sit in a legal gray area.

DAVIES: Jon Fasman is the U.S. digital editor for The Economist. His new book is "We See It All: Liberty And Justice In An Age Of Perpetual Surveillance." He'll be back to talk more after a break. And David Bianculli reviews the new four-part documentary series from HBO, "The Lady And The Dale." I'm Dave Davies. And this is FRESH AIR.

(SOUNDBITE OF MUSIC)

DAVIES: This is FRESH AIR. I'm Dave Davies, in for Terry Gross, who's off this week. We're speaking with Jon Fasman, the U.S. digital editor for The Economist magazine. His new book explores cutting-edge surveillance technologies increasingly employed by local police departments around the country, often with little oversight. His book is "We See It All: Liberty And Justice In An Age Of Perpetual Surveillance."

One technology that's not so new - but you describe some interesting uses of it by police - is drones. And you describe this pilot program by the Baltimore Police Department. You want to tell us what they did and, you know, what this company that provides the service does?

FASMAN: Sure. So the Baltimore Police Department used a company called Persistent Surveillance. And this is a technology that was born on the battlefields in Iraq, in Fallujah. The military wanted to catch people who were leaving IEDs by the side of the road. So an engineer named Ross McNutt, a naval engineer, attached low-res cameras to drones and set them to fly over huge swaths of the city in loops. So it actually did keep the city under persistent surveillance. And what they found is that when an IED went off, they could locate the footage and then essentially rewind it. So they could see who planted it, where they came from, who they talked to, where they lived - and they could solve the crime that way.
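
That "rewind" idea can be illustrated with a toy tracker: given the frame where an incident occurred and the dot involved, step backward frame by frame, matching the dot to its nearest neighbor in each earlier frame. Real wide-area-motion-imagery systems are far more sophisticated; this Python sketch only shows the principle:

```python
import math

def track_backward(frames, start_index, start_pos, max_jump=15.0):
    """frames: list of per-frame lists of (x, y) dot detections.
    Returns the dot's path from the incident backward in time."""
    path = [start_pos]
    pos = start_pos
    for i in range(start_index - 1, -1, -1):
        if not frames[i]:
            break
        nearest = min(frames[i], key=lambda p: math.dist(p, pos))
        if math.dist(nearest, pos) > max_jump:  # track lost
            break
        pos = nearest
        path.append(pos)
    return path  # followed far enough back, the path ends at a doorstep
```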

The BPD decided to run a pilot program and set it over a sizable swath of East Baltimore, I think. Now, I have seen what this footage looks like, and it is absolutely true that you can't tell a single thing about a single person from the footage. Everybody just looks like little dots. The way the BPD used it, it was only in response to a reported crime. So for, you know, a shooting or a carjacking, they would locate that incident, basically rewind it and see who did what. The same technology was used to solve killings in Juarez, when Juarez was in the middle of its crime wave.

So it's true that, you know, Ross McNutt was very, very adamant that you can't tell anything about any one person, and they are not using it to identify people. But that's certainly a way it could be used, right? What happens if another company attaches better-resolution cameras to drones and has them fly lower? What happens if, when that happens, you have a police chief who is upset about an anti-police brutality demonstration and decides to rewind the camera on the organizers of that demonstration to see what they might have done that they might be able to be picked up for?

It's that sort of thing that worries me, even if the current instance of persistent surveillance is not terribly alarming. And by not terribly alarming, I mean even if citizens of a certain city are not alarmed by the prospect that, having been accused of no crime, they may be observed all the time. Even if this incarnation doesn't let police see who they are, there's nothing stopping another company from doing something that would allow police to track and observe people all the time.

DAVIES: And even if people are represented as dots, I mean, you can imagine misuses, too. If I'm, you know...

FASMAN: Of course.

DAVIES: ...Stalking my ex-wife, I know her address. I see the dot coming out of her place, and then I see where she's headed. I mean, the interesting thing about this thing in Baltimore was that they didn't announce it to the public and it was revealed in the media. The reaction was substantial, right?

FASMAN: Yeah, it was substantial. And this is another problem with the technology that I write about: it's how police roll it out that often matters. So, you know, it was a newspaper that broke the story that the entire east side of Baltimore was under perpetual surveillance. Of course, citizens were outraged. I would be, too.

Now, that pilot program stopped, and the BPD later reintroduced it - after extensive public hearings and extensive public discussion. And that suggests that there is, in fact, a difference between a police department saying to itself, hey, let's try this and see how it works, and a police department going to the public and saying: this is what we'd like to do. This is how the tech works. This is what the data is. This is who gets to see it. This is how long we're keeping it. What do you guys think? And then working through with the public what they want, what they're comfortable with, what is acceptable to them. When that happens, technology that is quite frightening can often be, you know, acceptable to people.

DAVIES: And I guess it bears noting that in Baltimore and many big cities, a lot of shootings and murders go unsolved. So information that tracks where a shooter went could be valuable. Do we know anything about how useful it's been?

FASMAN: I have not looked into that. I should look into that. But I want to make one point about efficacy as justification. There are a whole lot of things that would help police solve more crimes that are incompatible with living in a free society. The suspension of habeas corpus would probably help police solve more crimes. Keeping everyone under observation all the time would help police solve more crimes. Allowing detention without trial might help the police solve more crimes. But all of these things are incompatible with living in a free, open, liberal democracy.

So when we think about these technologies and what we are willing to accept, we shouldn't just think about whether they'll help police solve more crimes, because almost all of them will, at least on the margins. The question is, is it worth the cost to our privacy and liberty to implement this technology? And if so, what limits are we willing to set? What penalties do we want for failing to observe those limits? So it's really a question not just of whether the technology works, but of whether it's worth the cost. And if it's not worth the cost, can we devise a way in which the police can have the tool they want to solve crimes, and we can be comforted that it won't be abused, won't be used against us, won't be used to surveil us?

DAVIES: We're speaking with Jon Fasman. He is the U.S. digital editor for The Economist magazine. His new book about police surveillance is called "We See It All." We'll continue our conversation in just a moment. This is FRESH AIR.

(SOUNDBITE OF AVISHAI COHEN'S "GBEDE TEMIN")

DAVIES: This is FRESH AIR. And we're speaking with Jon Fasman. He is the U.S. digital editor for The Economist magazine. He has a new book about cutting-edge technologies used by police departments in surveillance efforts. It's called "We See It All: Liberty And Justice In An Age Of Perpetual Surveillance."

You write a lot about facial recognition, which is a technology that's getting better and better. It obviously poses some issues in terms of privacy. And police will typically say they don't use facial recognition as evidence in court; they use it for tips and leads. Typical situation: a serious crime is committed - you know, an assault, a shooting - and security cameras catch somebody in the area. They use facial recognition to compare that to a database of mug shots. They find a suspect, and then they bring the victim before a lineup in which the suspect is present, or a photo lineup - an array of mug shots. It seems like not an unreasonable set of things to do, and potentially useful. What is the concern about the way departments are using this?

FASMAN: I think there are a few concerns. One of them is about how departments use it and whether they really are hewing to that principle that they profess that this is just a lead, this is not evidence. So what does it mean when a facial recognition system generates a match? It means that the system is telling you this person on camera is probably this person, the mug shot. Now, generally, these systems will generate a list of, you know, 20, 30, 40 possible matches ranked in order. What does it mean if the police catch the 30th person on that list as opposed to the second person on the list? How did they eliminate everyone else?
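
What a "ranked list of possible matches" means in practice: a probe face is reduced to an embedding vector and compared against a gallery, and the top scores come back as leads. A minimal Python sketch, with the trained face-embedding model assumed rather than shown:

```python
import numpy as np

def rank_candidates(probe, gallery, k=30):
    """gallery: dict mapping person_id -> embedding vector.
    Returns the top-k (person_id, cosine similarity) pairs."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = [(pid, cosine(probe, emb)) for pid, emb in gallery.items()]
    scores.sort(key=lambda item: item[1], reverse=True)
    return scores[:k]  # the 30th-ranked hit is a far weaker lead than the 1st
```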

I think another concern is that facial recognition systems, as they exist right now in the United States and Europe, are really bad at recognizing nonwhite people. And that is, I think, partly an artifact of the range of images that they were trained on. But what that means is nonwhite people often run the risk of being falsely identified and, therefore, if not falsely accused, then at least brought into contact with law enforcement in a way that white citizens are not as often.

Also, because of the history of overpolicing communities of color, there's the risk that databases will be filled with more nonwhite suspects than white suspects, so that a nonwhite suspect stands a greater chance of being identified in a database than a white suspect does. I think, then, that until facial recognition can solve this problem of bias, we should be very, very suspicious of its deployment. I think we may even want to be cautious about how it's deployed afterward.

Now, that is not a call to ban facial recognition entirely. There is a facial recognition system called Clearview that was used to identify a number of suspects in the Capitol insurrection. I think that is all to the good. The question, though, is what about the rest of us? What about our anonymity in public? What about our being put into databases without our consent? What does that mean for our privacy?

DAVIES: Right. Who is in the photo databases that law enforcement uses to compare images that they pick up? I mean, people who have been arrested have mug shots - and again, some of them may never have been convicted. What about things like, I don't know, driver's license photos? Are they used to look for matches?

FASMAN: Yes, they are often used to look for matches. As early as 2016, a study from Georgetown Law School found that 1 in every 2 Americans had their faces in an FBI-accessible facial recognition database. That number has almost certainly gone up since then. Also, the system that the police used to identify Capitol Hill rioters was called Clearview. One concern with Clearview, though, is that it is not just available to police departments. It is - or it was at some point - available to investors and to some private citizens as well. What this essentially means is that you can't really be sure that you're anonymous in public anymore.

So there is a story about a New York City grocery magnate who saw his daughter out on a date with someone, wanted to know who it was, snapped a picture of the guy. And immediately, he knew who the guy was, where he worked, where he lived, that sort of thing. So that's, I think, what's worrying about unregulated facial recognition is the extent to which it imperils your ability to be anonymous in public.

DAVIES: You know, it's now used by some airlines for boarding planes at the gate, right? Rather than going through and scanning your boarding pass, you can just show them your face. You've encountered that. Your advice is, don't do this; don't normalize this. Why?

FASMAN: That is my advice. I never, ever use facial recognition to board at the gate. And I would advise anyone else who is concerned about our civil liberties not to do it. And that's not because you run some great risk of being, you know, pulled off a plane and detained falsely when you board. It is because the more you opt to use facial recognition in ways that you think are benign, the more it will be used in ways that you may not think are benign.

So the reason to avoid using facial recognition to board a plane is that you don't want to normalize this technology at this stage of its existence. You don't want to normalize the fact that your face is - becomes sort of your passport and everywhere you go, you're tracked and recorded. You want to do your best - or at least I think people concerned about civil liberties should want to do their best to make sure that this technology is only used in a specific set of circumstances and a specific set of ways.

DAVIES: Another thing you write about is the use of algorithms by courts and police departments. One use is deciding who ought to be held on bail, based on all these factors - you know, the defendant's past criminal record, age, et cetera. And then there are also algorithms designed to guide departments on where to emphasize police patrols. What are your concerns about this kind of technology? It's not exactly surveillance.

FASMAN: It's not exactly surveillance. My concern is about predictive policing programs - and what I mean by that are programs that ingest an enormous amount of historical crime data and say, based on the data, based on past practice, these are the areas that we think are likely to be at elevated risk for crime today, so this is where you need to deploy your patrol officers. My concern is that historical crime data is not an objective record of all crimes committed in a city. It is a record of crimes that the police know about. And given the historic pattern of overpolicing minority communities and overpolicing poor communities, these programs run the risk of essentially calcifying past racial biases into current practices.

One of the justifications for these programs is that they remove human error, they remove human bias, and they're just using data. And, you know, it is true that they are not as reliant on human interpretation as past practices may have been. But I think it's important for people to understand there isn't really any such thing as just data. If one of these predictive policing programs ingests and makes recommendations based on, you know, nuisance crimes - vagrancy, public drinking, things that only get prosecuted because the police are there, and police are more present in minority communities and prosecute these sorts of crimes more often there - then you will get a program that essentially is a pernicious feedback loop of racial bias.
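
That feedback loop is easy to demonstrate with a toy simulation: two neighborhoods with identical actual crime, a slightly uneven starting deployment, and recorded crime that depends on where police are present. The data then "confirm" the initial skew forever. All numbers here are illustrative:

```python
def simulate(rounds=10):
    true_rate = [1.0, 1.0]  # two neighborhoods with identical actual crime
    patrols = [0.6, 0.4]    # slightly uneven starting deployment
    for _ in range(rounds):
        # crimes enter the data only where police are present to record them
        recorded = [true_rate[i] * patrols[i] for i in range(2)]
        total = sum(recorded)
        patrols = [recorded[i] / total for i in range(2)]  # redeploy to "hot spots"
    return patrols

print(simulate())  # still [0.6, 0.4]: the initial skew reproduces itself indefinitely
```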

DAVIES: The overriding theme of your book is that, you know, we're not going to roll back technology. It is here, but we as a democracy need to understand it and impose controls that preserve our civil liberties and, you know, balance that against the legitimate needs of law enforcement. And you cite a potential model here, and that's Oakland, Calif., which certainly has a troubled history between its police department and African American activist organizations. This process began with the Port of Oakland wanting to adopt a high-tech security system. What happened then?

FASMAN: That's right. So this was back in 2014. The port wanted to basically integrate camera feeds from all different city systems - police, fire, public schools - into one; I think it was called the Domain Awareness System. And, you know, that was happening right as the Snowden story was breaking, and it happened in Oakland, which, as you say, is a city that has a long and justifiably troubled relationship with its police force. So citizens started showing up to public meetings, demanding to know more about how the system would work and opposing it.

From that, there grew an entity called the Oakland Privacy Commission, which is attached to the city council. It is a commission made up of citizen volunteers, and its job is to evaluate any technology the city uses that has the potential to accrue citizens' private data and to ensure that there are adequate policies regarding its use, its retention and penalties for misuse. And it's funny - when I went out there, I expected to find that the Privacy Commission and the police department were just at loggerheads all the time. What I found was exactly the opposite.

The Privacy Commission has never told the police that they can't use a certain piece of technology, but they have ensured that the police explain what they want to do, explain why they need it and set standards for how it's going to be used and report, I believe quarterly, on how often it was deployed and in what circumstances. And I had the police tell me, look - these guys make us think about what we want. They save us from predatory vendors. They ensure that, you know, citizens have a voice in how they're policed. And it has improved relations between police and the local communities.

DAVIES: You know, that's such a contrast to so many efforts in which civilian review boards seek to monitor and regulate some police conduct. Those tend to be very fractious relationships, and the civilian boards, over time, sort of become advisory groups with little impact. Is the difference here that you had a department that just saw the value? Or was it the interaction, the practice itself, that made the difference?

FASMAN: I think it's a bit of both, but the interaction and practice had a lot to do with it. This is a group that has a very specific remit: technology. Tell us what you want. Tell us why you want it. Tell us how you're going to use it. Report on how you've used it. As long as we do that, as long as we know what you're doing and why, everything's going to be fine. And of course, there are exigent circumstance exceptions. If the police want to deploy a drone to chase someone who's just shot somebody so they can see where he's going, of course they can do that; they just have to explain it afterwards.

And I think it's that practice - the reporting on the part of the police and the willingness of the Privacy Commission to listen and create best practices - that has gotten people talking to each other. And I think that's true, honestly, across the board. You know, privacy activists like to complain about the police; police complain about privacy activists. But if you get them in a room talking to each other, instead of yelling at the TV cameras or whatever, I really think that in almost every instance they can agree on 90% to 95% of things. Those last 5% are going to be difficult, but it starts from the ground of a working relationship rather than a poisonous one.

DAVIES: Jon Fasman, thank you so much for speaking with us.

FASMAN: My pleasure.

DAVIES: Jon Fasman is the U.S. digital editor for The Economist. His book is "We See It All: Liberty And Justice In An Age Of Perpetual Surveillance." Coming up, David Bianculli reviews the new four-part documentary series from HBO "The Lady And The Dale." This is FRESH AIR.

(SOUNDBITE OF MUSIC)

DAVE DAVIES, HOST:

This is FRESH AIR. Beginning this Sunday, HBO premieres a four-part documentary series executive produced by the Duplass Brothers. It's called "The Lady And The Dale." Its title comes from the Dale, a bold, new three-wheeled car introduced but never produced in the 1970s, and the person who was behind it. Our TV critic David Bianculli warns that very little about this documentary is what it seems - not the subject, the structure or even the storytelling. And he likes it that way. Here's his review.

DAVID BIANCULLI, BYLINE: "The Lady And The Dale," a new HBO documentary miniseries co-directed by Nick Cammilleri and Zackary Drucker, is promoted by the network with most of its secrets held in check. Tune into this nonfiction biography series, the promos suggest, and learn the tale of a female automobile executive who took on the Detroit automakers and tried to market a gas-efficient car at the height of the oil crisis. But "The Lady And The Dale" is so much more than that.

Yes, the second installment of this four-part series is mostly about the weird three-wheeled car called the Dale, but the first hour of this miniseries is mostly about a small-time but enterprising con artist. The third hour is about a trial for grand larceny, with a defendant doing double duty as her own defense attorney. And the fourth hour is - well, there are so many surprises and revelations in the fourth hour, let's not even go there. The promos held their cards close to the vest, but the opening minutes of "The Lady And The Dale" hint at some of the intrigue to come. It starts with a vintage 1970s clip from "The Price Is Right," where one of the prizes being offered is a brand-new Dale automobile, and works quickly through some other period TV footage that adds many layers of mystery to the story.

(SOUNDBITE OF MONTAGE)

JOHNNY OLSON: These fabulous prizes may go to these people tonight if they know when the price is right.

(APPLAUSE AND CHEERING)

OLSON: And now, here's the ultimate - a three-wheeled car.

(APPLAUSE AND CHEERING)

OLSON: It's the Dale, a whole new concept in automotive design - a three-wheel chassis with a high-impact plastic resistant resin (ph) body, top speed of 85 miles per hour. For comfort and economy, it's the Dale, by Twentieth Century Motor Car Corporation.

UNIDENTIFIED REPORTER #1: A rough-talking promoter named Elizabeth Carmichael. She's president and prime mover behind the Dale.

UNIDENTIFIED REPORTER #2: The miracle is in the mileage - a promise of 70 per precious gallon.

ELIZABETH CARMICHAEL: We've taken a total concept, integrated it and built a whole new means of transportation.

UNIDENTIFIED REPORTER #3: Mrs. Carmichael is no pessimist.

UNIDENTIFIED REPORTER #4: ...Think you're going to be able to take on GM?

CARMICHAEL: We're going to whip GM.

UNIDENTIFIED REPORTER #5: A couple of nights ago on the television show "The Price Is Right," the grand prize was a three-wheel Dale automobile. That embarrassing show was taped several months ago, when the Dale was getting reams of publicity. But is the Dale all it's been cracked up to be? Now, state officials tell us that Mrs. Carmichael's fingerprints match those of a notorious con man. Mrs. Elizabeth Carmichael is actually Mr. Jerry Dean Michael. And the secret life of Jerry Michael is the stuff of wildly imaginative adventure novels.

(SOUNDBITE OF MUSIC)

BIANCULLI: And what a story it tells. It's the story of Jerry Dean Michael, who grew up in small-town Indiana and eventually became a con artist, working every scheme from counterfeiting money and checks to selling bogus get-rich-quick schemes door-to-door. He married and had five kids, but kept grifting. And every time the law got close, he'd pick up the family and move - over one stretch, according to the documentary, 21 times in three years.

Up to this point, "The Lady And The Dale" has the flavor and momentum of "Catch Me If You Can," that Steven Spielberg movie with Tom Hanks on the trail of a teenage con man played by Leonardo DiCaprio. But then, before Episode 1 is over, Jerry embraces the identity and life of a trans woman. Jerry becomes Liz, and the kids are raised to call her mom. And life in this series goes on from there. She adopts the name Elizabeth Carmichael, starts the Twentieth Century Motor Car Corporation and introduces the Dale.

Eventually, Elizabeth Carmichael is arrested for fraud and goes on trial over the failure to produce the Dale. But in the '70s, with transgender issues so relatively unfamiliar and widely misunderstood, she goes on trial in other ways, too - especially in the media. And thanks to lots of old clips and some shockingly candid interviews, this close look at the media's treatment of her is one of the most valuable aspects of "The Lady And The Dale." The narrative manages to name-drop such pioneering transgender public figures as Christine Jorgensen and Renee Richards, but they're far more than footnotes and become part of this story. And by Part 4, when the NBC series "Unsolved Mysteries" gets into the act, you not only feel for Liz Carmichael but get a strong sense of unfair media bias in some very palpable, specific examples.

One more thing about "The Lady And The Dale": it uses animation more, and more imaginatively, than almost any documentary I've ever seen. When the filmmakers have family members and friends and enemies all talking about Liz but don't have footage other than those talking heads, stiff but lively animation is used. And while the Duplass Brothers, Mark and Jay, are among this show's executive producers, special credit should be given to the director of animation, Sean Donnally. Watch "The Lady And The Dale," and you'll see why he deserves mention and why the program itself deserves attention.

DAVIES: David Bianculli is a professor of television history at Rowan University in New Jersey and editor of the website TV Worth Watching. He reviewed the new HBO documentary series, "The Lady And The Dale," beginning Sunday on HBO.

On tomorrow's show, Washington Post reporter Craig Timberg considers the impact of President Trump's departure on the QAnon movement and other extremist groups. Followers of QAnon conspiracy theories long believed that President Trump would defeat and imprison the deep state actors and pedophiles betraying America. That didn't happen, and it's shaken the beliefs of some in the movement. I hope you can join us.

(SOUNDBITE OF BRUCE HORNSBY'S "BACKHAND")

DAVIES: FRESH AIR's executive producer is Danny Miller. Our engineer and technical director is Audrey Bentham. Our interviews and reviews are produced and edited by Amy Salit, Phyllis Myers, Roberta Shorrock, Sam Briger, Lauren Krenzel, Heidi Saman, Therese Madden, Ann Marie Baldonado, Thea Chaloner, Seth Kelley and Kayla Lattimore. Our associate producer of digital media is Molly Seavy-Nesper. Roberta Shorrock directs the show. For Terry Gross, I'm Dave Davies.

(SOUNDBITE OF BRUCE HORNSBY'S "BACKHAND")

Transcripts are created on a rush deadline, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of Fresh Air interviews and reviews are the audio recordings of each segment.
