Matthew Mitchell doesn’t want my phone number. He knows that if he got it, he could hack into my digital life. “It’s just easier for everybody,” he says as we begin our interview. “I’m not going to let you expose yourself.”
This is not a boast. Mitchell, who is in his forties, wants me to grasp how easily I can be hacked and spied on, and my civil liberties breached. He’s only agreed to video-chat with me from New York if I download a little-known encrypted communications app, Wire.
When I do, Mitchell tells me about his fear of governments and law enforcement aided by powerful tech companies and why he wants to help fight back.
A security researcher by day, Mitchell also co-hosts “crypto parties”. The grassroots movement began in 2012 and consists mainly of technologists leading free workshops that teach people how to use the internet anonymously. (The “crypto” in this case refers to the online anonymity you learn to achieve; the “party” is a matter of personal definition.)
In 2013, Mitchell put his own spin on proceedings by setting up CryptoHarlem, a non-profit that hosts crypto parties at a packed community centre in the uptown New York City neighbourhood. These aim to help communities of colour achieve digital privacy and combat the high-tech surveillance that police often deploy at protests.
At CryptoHarlem’s weekly gatherings, which have taken place over livestream during the pandemic, dozens of attendees learn, for example, how to use encrypted communication apps and how to make it more difficult for computer recognition systems to track them. This education is all the more urgent, Mitchell says, as police increasingly wield overhead surveillance drones and “stingrays”, van-mounted antennae that track mobile phones at protests.
“We’re living in a Terminator sci-fi world,” says Mitchell, who is also a technology fellow at the philanthropic Ford Foundation and works with the Movement for Black Lives. “It’s an effort to teach folks who are living in a dystopian future about the harms of surveillance and how technology is happening to them.”
Mitchell isn’t alone in promoting the delicate craft of “anti-surveillance”. In the eight years since the Edward Snowden leaks revealed the breadth of mass surveillance, public anxiety about privacy from the prying eyes of authorities has steadily grown. The quasi-militarised and highly technological police response to various mass protest movements of the past decade, from Occupy Wall Street to Black Lives Matter, has only fuelled concern among activists and civil-society groups.
Meanwhile, surveillance as a market has exploded: analysts at the Business Research Company estimate that surveillance tech will grow globally from $83bn in sales in 2020 to $146bn by 2025.
Crypto parties are just the start. Fired by what they see as unchecked state power, activists have started packing their tool kits with fringe technologies of their own.
The history of surveillance is a history of technology. The “lantern laws” of 17th- and 18th-century America, for example, required slaves to carry a lantern or candle if they were walking at night unaccompanied by a white person. They ensured slaves were visible — and therefore trackable — writes University of Texas professor Simone Browne, in her book Dark Matters: On the Surveillance of Blackness. That connection carried on through fingerprinting in the 19th century and the American domestic spying scandals of the 1970s to Snowden in 2013.
Today, the state’s technological force is on show. At last summer’s Black Lives Matter protests in the US, drones, planes and helicopters circled over 15 cities, gathering at least 270 hours of footage on behalf of the Department of Homeland Security. The Intercept, an investigative online publication, reported that during the marches police had used special access to social-media surveillance tools that monitor protesters’ online discussions in real time.
At some point, the surveilled started fighting technology with technology. Take the apps Telegram and Threema, which offer ephemeral messaging that erases records of conversations after they’ve taken place. Once favoured by drug dealers and organised crime, they’ve both become more mainstream.
Telegram and Threema were in the toolkit wielded by Kalaya’an Mendoza, a 42-year-old activist who works for Nonviolent Peaceforce, a self-described peacekeeping group, as he took part in protests in mid-April after the police killing of Daunte Wright. At the marches outside the Brooklyn Center Police Department, Mendoza also used Riseup Pad, a collaborative text document application, as an alternative to Google Docs.
“Most folks are not organising high-profile actions or holding information about activists on those tools,” he says, because activists do not trust mainline technology companies to be “secure”.
Even when communications can be kept entirely private, police may still potentially tap into metadata, according to Alan Woodward, an encryption expert and professor at the University of Surrey. This vast amount of contextual information about a given message — such as time sent, location, the host website of any links shared and picture or file names — can help law enforcement track people.
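Woodward’s point can be made concrete with a few lines of code. The sketch below, with invented names and timestamps, shows how a contact graph falls out of message metadata alone, without reading a single message body:

```python
from collections import Counter

# Hypothetical metadata records: sender, receiver, timestamp, and no
# message content at all. (All names and times here are invented.)
metadata = [
    ("alice", "bob",   "2021-04-11T21:02"),
    ("alice", "bob",   "2021-04-11T21:05"),
    ("alice", "carol", "2021-04-12T09:30"),
    ("bob",   "carol", "2021-04-12T09:31"),
]

# Count who talks to whom, ignoring direction: a social graph emerges
# from the "envelope" data even when every message is encrypted.
pairs = Counter(frozenset((sender, receiver)) for sender, receiver, _ in metadata)
for pair, count in pairs.most_common():
    print(sorted(pair), count)
```

Scaled up to millions of records, the same counting exercise reveals who is organising with whom, and when.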
[Pictured: Matthew Mitchell, who hosts ‘crypto parties’ teaching digital privacy; Kate Bertash, whose clothing disrupts automated licence plate readers; and Adam Harvey, whose ‘CV Dazzle’ project confuses facial-recognition software]
In response, some protesters have adopted another tactic: steganography, the hiding of secret messages inside non-secret ones. Typically, this involves using a tool to embed your secret message in a seemingly innocuous file such as an image, video or audio clip. Steganography replaces “the unused or useless data of a regular computer file” with an invisible message, according to the Infosec Resources website. The receiver then uses the same software, or a specific command, to decode and reveal the hidden information. “You can see who’s talking to who, but you have no idea that it’s relevant,” says Woodward. “It’s hidden in plain sight.”
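A minimal sketch of the idea, hiding a short message in the least-significant bits of raw pixel bytes (the cover bytes here are a stand-in for real image data, not an actual file format):

```python
def embed(pixels: bytes, secret: bytes) -> bytes:
    """Hide `secret` in the least-significant bit of each byte of `pixels`."""
    # Unpack the secret into individual bits, most-significant first.
    bits = [(byte >> i) & 1 for byte in secret for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover data too small for this secret")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        # Overwrite only the lowest bit, an imperceptible change.
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def extract(pixels: bytes, length: int) -> bytes:
    """Recover `length` hidden bytes from the low bits of `pixels`."""
    bits = [b & 1 for b in pixels[: length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[j:j + 8]))
        for j in range(0, len(bits), 8)
    )

cover = bytes(range(256)) * 4          # stand-in for raw image pixel data
stego = embed(cover, b"meet @ 9")
assert extract(stego, 8) == b"meet @ 9"
```

No byte of the cover changes by more than one, which is why the carrier image looks identical to the eye.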
Another tool being developed by activists is the mesh network. In June 2019, the founder of Telegram claimed that China was behind a cyber attack that disrupted his app for several hours during the Hong Kong protests. This prompted activists to explore ways of creating interconnected groups of devices that can communicate with one another without going through a centralised node, such as a cell tower or WiFi hub, that a government can tap into.
“During Occupy Wall Street, there were a lot of ideas around mesh networks and off-grid offline communication, both to [help] when they shut down the internet and also to hide yourself,” says Nathan Freitas, a developer and director of the Guardian Project, which creates secure apps and open-source software libraries for privacy-conscious activists. “There was a dream . . . that we would be like a flock of birds in the streets, chirping at each other, and you could have the equivalent of a Twitter happening without needing cell-phone towers.”
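The chirping-flock idea can be sketched as a simple flood: each device relays a message to whoever is in range, so it spreads with no tower in the path. The topology below is invented for illustration:

```python
# A toy mesh: each node knows only its radio neighbours.
mesh = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C", "E"},
    "E": {"D"},
}

def flood(origin: str, mesh: dict) -> set:
    """Relay a message hop by hop; return every node it reaches."""
    seen, frontier = {origin}, [origin]
    while frontier:
        nxt = []
        for node in frontier:
            for peer in mesh[node] - seen:   # only relay to new peers
                seen.add(peer)
                nxt.append(peer)
        frontier = nxt
    return seen

print(flood("A", mesh))  # the message reaches the whole mesh, tower-free
```

Real mesh apps add encryption and routing on top, but the absence of any central node to shut down or tap is the core of the design.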
William Gibson called it “the ugliest T-shirt in the world”, a garment so grotesque that it confounds surveillance cameras and renders its wearer invisible. Gibson’s Zero History was a work of science fiction, but academics and activists are trying to make something like it reality. Attempts to inject optical hacks into make-up and clothing, dubbed “stealthwear” or adversarial fashion, show anti-surveillance kit may become physical as well as digital.
LA-based Kate Bertash heads the Digital Defense Fund, a non-profit that provides security for abortion-rights advocates. Following concerns that protesters outside abortion clinics were photographing vehicles and could potentially run them through automated licence plate readers (ALPRs) to identify individuals, she created ALPR-busting stealthwear in 2019.
In the US, ALPRs mounted on cars or road signs are commonly used by law enforcement to capture licence plates and compare them to state or federal “hot lists” of vehicles thought to be used by criminals. Plate scanners were a key tool used by the FBI to find and arrest participants in the January 6 insurrection in Washington, DC. Because the tactic is loosely regulated, many law-enforcement agencies share their data freely with others that use the same private-sector software providers. This has allowed them access to billions of data points of location information.
So Bertash adorned hoodies, shirts and dresses with fake licence plates and other kitsch designs. She found that the optical character recognition technology would pick up all the dummy plates from the clothes, becoming overwhelmed and confused. “The clothing is throwing more junk into [the image],” she says. “You’re poisoning the quality.”
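A toy illustration of the failure mode: a plate-shaped pattern matcher faced with one real plate and several plate-like prints cannot tell which is which. (Real ALPRs use trained OCR rather than a regex; the pattern and plate numbers here are hypothetical stand-ins.)

```python
import re

# A made-up plate-shaped pattern: three letters then three or four digits.
PLATE = re.compile(r"\b[A-Z]{3}[- ]?\d{3,4}\b")

# Text a scanner might read off a scene: one real plate on a bumper,
# plus junk plate-like prints on a shirt.
scene_text = "bumper: KXT 2041 | shirt: ABC 123 QRS 9999 XYZ 777"
hits = PLATE.findall(scene_text)
print(hits)  # four candidate "plates", only one of them real
```

Every extra false candidate the clothing injects dilutes the one genuine read, which is Bertash’s point about poisoning the quality of the data.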
One of the pioneers of design-based defence is Adam Harvey, an American mechanical engineer-turned-artist and activist who now lives in Berlin. In 2010, Harvey launched a project known as CV Dazzle, exploring ways camouflage could be wielded to disrupt “computer vision,” or how software interprets and analyses the information contained in images or video.
Facial recognition algorithms work by scanning for detectable geometries — the distance from forehead to chin, or eye to eye — and matching them to an existing data set. Harvey’s human models sport Cubist splodges of face paint and jagged, asymmetric haircuts so the algorithms lose all sense of where their features are. He dubs this the “anti-face”.
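A toy sketch of the matching step Harvey is attacking: reduce a face to a few inter-landmark distances, then compare them against an enrolled record. All coordinates, landmark names and tolerances below are invented for illustration.

```python
import math

def signature(landmarks: dict) -> tuple:
    """Reduce a face to a few inter-landmark distances, a toy stand-in
    for the geometric features real matchers extract."""
    d = lambda a, b: math.dist(landmarks[a], landmarks[b])
    return (d("left_eye", "right_eye"),
            d("forehead", "chin"),
            d("nose", "chin"))

# Hypothetical pixel coordinates for an enrolled face.
enrolled = signature({"left_eye": (100, 120), "right_eye": (160, 120),
                      "forehead": (130, 60), "chin": (130, 200),
                      "nose": (130, 150)})

def matches(probe: tuple, reference: tuple, tolerance: float = 5.0) -> bool:
    """Declare a match when every distance agrees within tolerance."""
    return all(abs(p - r) <= tolerance for p, r in zip(probe, reference))

# The "anti-face" works upstream of this step: if paint and hair stop
# the detector locating the landmarks at all, no signature is produced
# and there is nothing to compare.
```

The dazzle tactic does not fool the comparison; it denies the system the landmarks the comparison needs.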
The basic concept isn’t new: dazzle camouflage, consisting of complex geometric patterns in bright colours, was used to hide ships in the first world war. But Harvey’s project has inspired others, such as the UK’s Dazzle Club collective, who don face paint to conduct monthly silent marches through London and other British cities in protest against the police’s use of CCTV. “The procurement of more and more cameras and all the associated paraphernalia is about capitalism at its best: ‘You’re scared. We will protect you,’” says London-based artist Anna Hart, one of Dazzle Club’s founders.
Sartorial hacking of this kind undoubtedly remains in its infancy, more a performative rebellion against surveillance than a reliable defence. Where Harvey’s projects are “designed as a provocation”, Bertash says she is making a statement that, “You do not have to consent to [surveillance] all the time.”
There are still technological limitations. Harvey, Hart and Bertash all acknowledge that their make-up or outfits were initially designed to upset one specific algorithm and that newer ones are always being developed. What’s more, biometric detection systems can already adapt to recognise subjects with modifications — such as make-up — if enough photos of the same subject are fed into an algorithm’s training data.
Then there’s the kind of irony unique to dystopian fiction. “You make yourself hyper-visible,” says Harvey. “Until that point that everyone has painted their face in the crowd, then actually it could be putting you at more risk.” Still, the fact that hobbyists can even temporarily thwart costly machine-learning systems raises serious questions about how reliable those systems — upon which the outcome of court cases may rest — really are.
How quickly counter-measures are neutralised by counter-counter-measures is hard to assess. (Often, it is only through leaks or hacks that government and law-enforcement capabilities are revealed to the public.) But the business of surveillance is growing fast.
Faster, many argue, than the rules needed to govern its use. Oversight and regulation remain patchy and fragmented, differing from state to state in the US and country to country globally. “Surveillance is not being properly controlled or measured or audited,” Mitchell says. “If this was a weapon — a physical weapon like a gun or pepper spray — that would be, ‘No, you can’t do it.’ But because it [might be] a digital weapon, it’s quite easy to hide.”
As long as there is a free marketplace, surveillance technologies are bound to be developed and deployed. For anti-surveillance advocates, it may be hard to keep pace and ensure that their toolkit remains accessible.
Already “the [anti-surveillance] technology that I’m an expert in that I help people use is very inconvenient”, says Mitchell, noting some of the software typically wasn’t designed by the marginalised communities that need it most. He concedes that over-surveillance cannot be overturned solely by hobbyist technologists.
Before we say goodbye and log off Wire, Mitchell tells me the public deserves more transparency and more opportunities to debate security versus privacy. “We can’t win with only directly affected people, right? You need everyone to care.”
Hannah Murphy is an FT tech correspondent in San Francisco