Far from loosening incarceration’s grip, modern tools like predictive policing and tracking apps have instead deepened the carceral state’s influence over everyday life.
This article is part of Abolition for the People, a series brought to you by a partnership between Kaepernick Publishing and LEVEL, a Medium publication for and about the lives of Black and Brown men. The series, which comprises 30 essays and conversations over four weeks, points to the crucial conclusion that policing and prisons are not solutions for the issues and people the state deems social problems — and calls for a future that puts justice and the needs of the community first.
From everyday apps to complex algorithms, technology has the potential to hide, speed, and deepen discrimination, all while appearing neutral and even benevolent when compared to overtly racist practices of a previous era. Predictive policing programs, criminal risk assessment tools, and electronic ankle monitors are a few of the tools that perpetuate the injustices of the U.S. criminal legal system, or what I call the New Jim Code. The good news is that with the rise of the New Jim Code, many individuals and organizations are developing abolitionist tools as part of a larger data justice movement — challenging surveillance technologies that harm communities and designing interventions that foster collective well-being. Struggles over abolitionist futures are being waged not just in the streets, but on our phones, apps, and platforms.
For example, Appolition is an app that converts your daily change into bail money to free Black people from jail. (Calls for abolition are never simply about bringing harmful systems to an end but also envisioning new ones.) When Appolition co-founder Kortney Ziegler and I sat on a panel together at the 2018 Allied Media Conference, he pointed out the existence of similar technologies that present themselves as liberatory but whose creators do not share an abolitionist commitment. At the time, Jay-Z’s Roc Nation had invested in a “decarceration startup” called Promise, which aims to address the problem of pretrial detention for people who cannot afford bail. But among its other features, Promise is also in the business of digitally tracking individuals to ensure they meet court appointments, show up for drug tests, and comply with other forms of supervision as part of individual “Care Plans” — a euphemism if there ever was one.
In a piece on the website for BYP100 (Black Youth Project 100), an organization focused on transformative leadership development, direct action organizing, advocacy, and education, writer Alyxandra Goodwin described Promise as a harbinger of the Prison Industrial Complex’s next iteration. “The digital sphere and tech world of the 2000s is the next sector to have a stronghold around incarceration,” she wrote, “and will mold what incarceration looks like and determine the terrain on which prison abolitionists have to fight as a result.”
If both Appolition and Promise help people who cannot afford bail get out of cages, why is Promise a problem for those who support prison abolition? Because it creates a powerful mechanism that makes it easier to lock people back up — and because, rather than turning away from the carceral apparatus, it extends that apparatus into everyday life.
Whereas the money crowdfunded for Appolition operates like an endowment that is used to bail people out, Promise is an investment in and collaboration with law enforcement. The company, which received $3 million in venture capital, is not in the business of decarceration but is part of the “technocorrections” industry, which seeks to capitalize on very real concerns about mass incarceration and the political momentum of social justice organizing. Products like Promise make it easier and more cost-effective to track and reimprison people for technical violations like missing a court appointment or a drug test.
Promise, in this way, is exemplary of the New Jim Code; it is dangerous and insidious precisely because it is packaged as social betterment. For-profit prison conglomerates such as The GEO Group and CoreCivic (formerly Corrections Corporation of America, or CCA) are proving especially adept at reconfiguring their business investments to create similar misdirection, leaving prisons and detention centers and turning to tech alternatives like ankle monitors and other digital tracking devices. In some cases, the companies that hold lucrative government contracts to imprison asylum seekers are the same ones that ICE hires to provide social services to these very people, even as they continue to be monitored remotely. While not being locked in a cage is an improvement, the alternative is a form of coded inequity and carceral control, and it is vital that the people committed to social justice look beyond the shiny exterior of organizations that peddle such reforms.
A key tenet of prison abolition is that caging people works directly against the safety and well-being of communities because jails and prisons do not address the underlying reasons why people harm themselves and others — in fact, they exacerbate the problem by making it even more difficult to obtain any of the support needed to live, work, and make amends for harms committed. But in the age of the New Jim Code, and as abolitionists have long argued, our vision must extend beyond the problem of caging to encompass the technological innovations marketed as supporting prison reform.
It is vital to divert money away from imprisonment to schools and public housing if we really want to make communities stronger, safer, and more supportive for all their members. But, as abolitionist organization Critical Resistance has argued, simply diverting resources in this way is no panacea because schools and public housing as they currently function are an extension of the Prison Industrial Complex: Many operate with a logic of carcerality and on policies that discriminate against those who have been convicted of crimes. Pouring money into them as they are will only make them more effective in their current function as institutions of social control.
We have to look beyond the surface of what they say they do to what they actually do, just as I am calling on all of us to question the tech industry’s “do good” rhetoric. This requires us to consider not only the ends but also the means. How we get to the end matters. If the path allows private companies, celebrities, and tech innovators to cash in on the momentum of communities and organizations that challenge mass incarceration, the end achieved will likely replicate the current social order.
“To see things as they really are, you must imagine them for what they might be,” said the late legal and critical race scholar Derrick A. Bell, urging a radical assessment of reality through creative methods and racial reversals. Take, for instance, a parody project that begins by subverting the anti-Black logics embedded in new high-tech approaches to crime prevention. Instead of using predictive policing techniques to forecast what some might call “street crime,” the White Collar Early Warning System flips the script by creating a heat map that flags city blocks where financial crimes are likely to occur. The system not only brings into view the hidden but no less deadly crimes of capitalism and the wealthy’s hoarding of resources, but also includes an app that alerts users when they enter high-risk areas to encourage “citizen policing and awareness.”
Taking it one step further, the development team for the White Collar Early Warning System is working on a facial recognition program meant to flag individuals who are likely perpetrators, and the training set used to design the algorithm includes the profile photos of 7,000 corporate executives downloaded from the popular professional networking site LinkedIn — subverting the documented racism of A.I. algorithms and predictive policing by using a corpus of data that is largely white and male. By deliberately and inventively upsetting the status quo, analysts can better understand and expose the many forms of discrimination embedded in and enabled by technology. Together we must critically examine the progressive narratives that surround technology, shining a light on how technical fixes for social problems can perpetuate racism, as we continue to seed an abolitionist world where we can all thrive.
[Originally published on October 23, 2020 via Medium]