The spy plane and the "problem of experimentation"
A few years ago, people in Baltimore, if they happened to look up, might have noticed a small plane circling overhead. What they wouldn’t have known is that the plane was taking photos of them, and in fact of their entire neighborhood, through a wide-angle lens. One photo every second, all day.
This was the infamous “spy plane” from Persistent Surveillance Systems, hired by the Baltimore Police Department, with the stated goal of reducing violent crime in the city. The idea was that when a shooting occurred in range of the spy camera, police would be able to access the footage and then scroll backwards in time to see where the shooter had started out – what car they were in, what house they had driven from – in order to establish them as a suspect.
The promise of the plane footage was that police would be working from purely objective data – camera images – rather than conversations (or lack thereof) with people on the ground. Many citizens didn’t trust the police, so – the thinking went – cameras would provide something people could trust.
As it turned out, the spy plane was not a silver-bullet solution. Camera images, once downloaded to analyst terminals, still needed to be interpreted. And although the images were high-resolution in aggregate, they contained little visual detail for any individual car or pedestrian. A car might appear as little more than a smudge in the photo (as shown in the book cover image below).
Tracking a suspect’s car backward in time was particularly challenging when, in traffic, the car disappeared behind buildings or trees for several frames. Analysts, doing their best to be accurate, were often unsure whether they had in fact followed the correct car or person through the sequence of photos. This raised the specter of false positives, or misidentifications, which could put innocent people at risk of a traumatic or violent confrontation with police.
And then there was the fact that the spy plane’s cameras were trained mostly on the Black neighborhoods of Baltimore, justified by the claim that the plane would help solve violent crime. (Later research undercut that claim, finding the cameras useful mainly for low-level crimes.) This meant that the most vulnerable people in the city were subjected to surveillance by a largely experimental technology, along with its failure modes, like misidentification.

The story from Baltimore is told in Spy Plane: Inside Baltimore’s Surveillance Experiment, a new book by Ben Snyder, a sociology professor at Williams College. The key word is “experiment”: as Snyder writes, the “problem of experimentation” is common in police technology, which is often deployed on marginalized communities that lack the political capital to resist or challenge those systems.
Don’t make the mistake of thinking this has nothing to do with you. Whatever is launched on the vulnerable eventually spreads to everyone else. Did you see the article this week about how the New York Police Department is starting to use drones to watch citizens from the air? As one NYPD official put it, “when they’re not responding to 911 calls, I want the drone team to be on patrol like a regular police car.”
And it’s not just police tech. As Snyder writes,
this kind of experimentation isn’t just limited to the crime technology market . . . it’s a core part of the ‘move fast and break things’ culture of many tech startups.
All around us we see the unchecked growth of tech products and platforms, bringing with them the usual consequences: loss of privacy, asymmetry of power, environmental degradation, and so on. Spy Plane serves as a good introduction to how experimental technologies are designed, why they are so enthusiastically adopted (at first), and what we can do about it.
You might also like my recent interview with Spy Plane author Ben Snyder on this week’s Techtonic:
Episode page with links and listener comments
. . . and, as always, you can find more Techtonic at the website: techtonic.fm.
This week on the Forum
On our members-only Creative Good Forum, we have recent topics on...
- the latest in surveillance glasses from Meta/Facebook and Snap
- a list of risks from generative AI, compiled by members and me
- an important announcement for users of Amazon’s Alexa surveillance devices: everything you say to your Echo will be sent to Amazon starting on March 28
- a new type of mobile spyware, called Paragon, joining others like Pegasus in threatening anyone with a smartphone
- recent research and news on tech and kids
My weekly column is only one small slice of the resources Creative Good makes available to members. Please join us today to get full access to the Forum. You’ll also be pitching in to help fund the work I do on this newsletter.
Until next time,
-mark
Mark Hurst, founder, Creative Good
Email: mark@creativegood.com
Podcast/radio show: techtonic.fm
Follow me on Bluesky or Mastodon
P.S. You’re on an unpaid subscription. Please upgrade to show your support.