Transcript:
Morgan Ames: All right, thank you all so much for coming out. I am so thrilled that so many of you have come to hear Ruha speak today. My name is Morgan Ames, and I'm the interim Associate Director of Research at the Center for Science, Technology, Medicine and Society, one of the co-sponsors of this event. I am going to be introducing the introducer: Denise Herd will come up and say a few words about our sponsors and also about Ruha, who we're very happy to have join us today. Thank you so much.
Denise Herd: Well, good afternoon and welcome. It's my great pleasure to be here and also to be able to introduce our special guest speaker for today's talk, Professor Ruha Benjamin. Dr. Benjamin is an associate professor of African-American studies at Princeton University. She's the founder of the JUST DATA Lab and author of two path-breaking books. The first one is People's Science: Bodies and Rights on the Stem Cell Frontier, which came out in 2013, and her new book that just came out, which I hope you got your copy of, is Race After Technology: Abolitionist Tools for the New Jim Code. She also has a number of other distinguished publications.
Denise Herd: Ruha's work investigates the social dimensions of science, medicine and technology with a focus on the relationship between innovation and inequity, health injustice, knowledge and power. She is the recipient of numerous awards and fellowships, including from the American Council of Learned Societies, the National Science Foundation, the Institute for Advanced Study, and the President's Award for Distinguished Teaching at Princeton University. I also had the pleasure of getting to know Ruha personally... she worked with me when she was a graduate student here at UC Berkeley, and I can personally attest to her creativity and brilliance.
Denise Herd: She worked with me on a project that we were doing on the cultural and health dimensions of images of alcohol, drugs and violence in rap music. She's always been at the forefront of what's happening in culture, especially African-American culture. I think Ruha's work is critically important for facing the challenges of creating social justice and furthering human rights in our current age of hyperincarceration, as we as a nation, and actually as a planet, look increasingly to technology, to big data and algorithms to solve health and social problems.
Denise Herd: As a professor in the School of Public Health and as a person on this campus, I can tell you that there is a kind of romance with big data and with technology as something that's going to solve all of our problems, and we need an equity lens on those kinds of developments. Also, as some of you might be aware, in this year 2019 UC Berkeley is acknowledging 400 years of African-American resistance to enslavement and oppression, since the first enslaved Africans were brought to the American colonies. After over 300 years of slavery and legal segregation, or Jim Crow, or what Saidiya Hartman calls living in the afterlife of slavery, the US has now entered into a period of the new Jim Crow through mass incarceration of African-Americans and other marginalized peoples, as described by Michelle Alexander.
Denise Herd: As we now enter the age of e-incarceration, which Ruha's work so importantly names the New Jim Code, abolitionist work is increasingly important and has a special place in the struggle for justice and human rights. Before I welcome Ruha to the stage to give what I know will be a really fascinating, engaging and thoughtful presentation, I'd like to take a moment now to thank our sponsors: the Center for Science, Technology, Medicine and Society; the Haas Institute for a Fair and Inclusive Society, of which I'm the associate director; the CITRIS Policy Lab; and the Algorithmic Fairness & Opacity Working Group.
Denise Herd: I'd like to also thank all the volunteers and staff working here with the various co-sponsors, as well as here at the auditorium, and I'd especially like to thank all of you for joining us today for this event. And so now I'm delighted to welcome Professor Ruha Benjamin to the stage, who will speak on the New Jim Code: race, carceral technoscience and liberatory imagination in everyday life. Welcome, Ruha.
Ruha Benjamin: Good afternoon.
Audience: Good afternoon.
Ruha Benjamin: It's good to be here. I graduated in 2008, so I get to see so many familiar faces, friends, colleagues, professors. Oftentimes when I'm back on campus, I get a little rash or itch because I think there's a deadline that I'm missing; that's what it is to be back in the place where you spent all those years as a student. But still... it's lovely to be back. I really want to thank the folks behind the scenes, many of whom I probably don't know, but in particular Takiyah Franklin for making this possible, Morgan Ames for inviting me, and my wonderful mentor, Professor Denise Herd. Thank you so much for having me.
Ruha Benjamin: Thank you to all the co-sponsors. I often think of it as a mark of what's going to be a generative discussion when units from around campus that don't necessarily work together on a regular basis join together to co-sponsor something like this, so I'm excited to be here. Please join me in acknowledging that the land on which we gather is the traditional and unceded territory of the Ohlone people. Let's also acknowledge the intertwined legacies of devastation of the transatlantic slave trade and settler colonialism, which contributed to the creation and continued wealth of this university and of the nation state itself.
Ruha Benjamin: We acknowledge the reparations owed to black and indigenous communities and nations and the impossibilities of return for generations past. Let's also acknowledge the ancestors in the room this afternoon as we fight together for better futures. We are alive in an era of awakening and mobilization to preserve this planet and all of the beautiful creation that is no doubt worthy of this struggle. With that, let me begin with a recent experience I had, being a nosy sociologist, walking by two men in Newark International Airport when I overheard one say to the other, "I just want someone I can push around." I didn't stick around to hear the end of the sentence, but I could imagine all types of endings.
Ruha Benjamin: It could be in the context of looking through resumes, deciding who to hire: I just want someone I can push around at work. Or in the context of dating or marriage: I just want someone I can push around in my personal life. The desire to exercise power over others is a dominant mode of power that has been given new license to assert itself, the kind of power that requires others to be subordinate, though we should remember this is not the only mode or theory of power. At the time I was traveling to speak with students at Harvey Mudd College about issues of technology and power, and so when I overheard this conversation, I thought about this article and advertisement from a 1957 Mechanics Illustrated. The robots are coming, and when they do, you'll command a host of push-button servants. And then it says: in 1863, Abe Lincoln freed the slaves, but by 1965, slavery will be back.
Ruha Benjamin: We'll all have personal slaves again. Don't be alarmed. We mean robot slaves. So much going on in this one little paragraph; we could spend an hour, I'm sure, close reading and talking about it, but for the sake of time, I'll just point out two things. One is the date, 1957, a time when those who were pushed around in the domestic sphere, wives, domestic servants and others, could no longer be counted on to dress you, comb your hair and serve you meals in a jiffy, as the ad says.
Ruha Benjamin: During World War II, many more white women entered the workforce to take up jobs formerly occupied by men who left to fight the war, and black men and women, most of whom worked in agricultural and domestic work, also entered the manufacturing workforce. Hence the desire to replace those sources of free and cheap labor in the home with push-button robots. The point is, no technology is preordained, but rather the broader context makes some inventions appear desirable and inevitable. Perhaps even more telling is that we will all have personal slaves again. That one little word tells us something about the targeted audience of that ad: certainly not the descendants of those who were enslaved the first time.
Ruha Benjamin: The imagined user is gendered, raced and classed without gender, race, or class ever being mentioned. Code words, in this case, encode interlocking systems of inequality as part of the design process; precisely by ignoring social inequalities, tech designers will almost certainly reproduce them. True in 1957, true today. With that, let me offer three provocations as a kind of trailer for the talk. This way, if you have to leave early, your phone starts buzzing or you get distracted or bored, you'll know exactly what I want you to know. First, racism is productive, not in the sense of being good, but in the literal capacity of racism to produce things of value to some, even as it wreaks havoc on others.
Ruha Benjamin: We're taught to think of racism as an aberration, a glitch, an accident, an isolated incident, a bad apple, in the backwoods, outdated, rather than innovative, systemic, diffuse, connected incidents, the entire orchard, in the ivory tower, forward-looking, productive. In sociology we like to say race is socially constructed, but we often fail to state the corollary: that racism constructs. Secondly, I'd like us to think about the way that race and technology shape one another. More and more people are accustomed to thinking about the ethical and social impact of technology, but this is only half of the story.
Ruha Benjamin: Social norms, values and structures all exist prior to any tech development, so it's not simply about the impact of technology, but the social inputs that make some inventions appear inevitable and desirable, which leads to a third provocation: that imagination is a contested field of action, not an ephemeral afterthought that we have the luxury to dismiss or romanticize, but a resource, a battleground, an input and output of technology and social order. In fact, we should acknowledge that most people are forced to live inside someone else's imagination, and one of the things we have to come to grips with is how the nightmares that many people are forced to endure are the underside of elite fantasies about efficiency, profit and social control. Racism, among other axes of domination, helps produce this fragmented imagination: misery for some, monopoly for others.
Ruha Benjamin: This means that for those of us who want to construct a different social reality, one grounded in justice and joy, we can't only critique the underside but we also have to wrestle with the deep investments, the desire even for social domination. I just want someone I can push around, so that's the trailer. Let's turn to some specifics. Beginning with a relatively new app called Citizen, which sends you real time crime alerts based on a curated selection of 911 calls. It also offers a way for users to report, livestream and comment on purported crimes via the app and it also shows you incidents as red dots on a map so you can avoid particular areas, which is a slightly less racialized version of apps called Ghetto Tracker and Sketch Factor, which use public data to help people avoid supposedly dangerous neighborhoods.
Ruha Benjamin: Now, you're probably thinking, what could possibly go wrong in the age of Barbecue Beckys calling the police on black people cooking, walking, breathing out of place? It turns out that even a Stanford-educated environmental scientist living in the Bay Area is an ambassador of the carceral state, calling the police on a cookout at Lake Merritt. To paraphrase Claudia Rankine, the most dangerous place for black people is in white people's imagination. It's worth noting too that the app Citizen was originally called the less chill name Vigilante, and in its rebranding it also moved away from encouraging people to stop crime to simply encouraging them to avoid it.
Ruha Benjamin: As one member of the New York City Council put it, crime is now at historic lows in the city, but because residents are constantly being bombarded with push notifications of crime, they believe the city is going to hell in a handbasket. Not only is this categorically false, it's distracting people from very real public safety issues like reckless driving or rising opioid use that don't show up on the app. What's most important to our discussion is that Citizen and other tech fixes for social problems are not simply about technology's impact on society but also about how social norms and structures shape what tools are imagined necessary in the first place.
Ruha Benjamin: This dynamic is what I take up in two new books. The first examines the interplay between race and automation, machine bias as an extension of older forms of racial domination. The second is an edited volume on the carceral dimensions of technology across a wide array of social arenas, from more traditional sites like policing and prisons to less obvious contexts like the retail industry and the digital service economy. As just one example from this volume, a chapter by Madison Van Oort draws on her ethnography of worker surveillance in the retail industry, where the same companies pitching products for policing and imprisonment to the department of corrections are also pitching them to H&M and Forever 21 to track employees.
Ruha Benjamin: Even as she shows how workers are surveilled well beyond the confines of their workplaces, including in their online activity, Van Oort also highlights how her coworkers use technology in ways that counter the experience of alienated labor, what we might call duplicity at work. On this point, I'd like to just pause for a minute and turn to science fiction as part of expanding our sociological imagination. The clip that I'm going to show you is from the film Sleep Dealer by Alex Rivera, and it reveals how global capitalism is ever ready to turn racialized populations into automata: Mexicans not as migrant workers but as machines that work in the US without setting foot in this country.
Speaker 5: In this scene you'll see the character as he crosses the border for the first time. He's in Mexico, but he comes to America in a new way.
Ruha Benjamin: In this world, migrant workers are replaced by robots who are controlled virtually by laborers in Mexico carrying out a variety of jobs in construction, childcare, agriculture, and more. Not only is the tech invasive, as we see, but it also allows unprecedented surveillance: if a worker falls asleep for an instant, the computer wakes her up, registers the lapse and docks her pay. Amazon warehouses on steroids. Over the course of the film, Memo Cruz starts working at one such factory; they're called Sleep Dealers because workers often collapse from exhaustion when they're plugged into the network too long.
Ruha Benjamin: In this way, the film reminds us how the fantasy of some is the nightmare of others, and that embodiment does not magically cease to matter with automation but can actually become more intensified, intrusive and violent. It's worth recalling that the Czech word robot is drawn from the Slavic robota, which means servitude, hardship, and as anthropologist Kathleen Richardson observes, robots have historically been a way to talk about dehumanization.
Ruha Benjamin: Sleep Dealer also brings to life an idea that inspired the title of the volume: that technology captivates, fascinating, charming, and bewitching while potentially subduing and subjugating people. To engage this tension, we have to pierce through the rhetoric and marketing of tech utopianism as we try to understand the duplicity of tech fixes, purported solutions that can nevertheless reinforce and even deepen existing hierarchies. In terms of popular discourse, what got me interested in this tension was the proliferation of headlines and hot takes about so-called racist robots.
Ruha Benjamin: A first wave of stories seemed shocked at the prospect that, in Langdon Winner's terms, artifacts have politics. A second wave seemed less surprised: well, of course technology inherits its creators' biases. And now I think we've entered a phase of attempts to override or address the default settings of racist robots, for better or worse, and one of the challenges we face is how to meaningfully differentiate technologies that are used to differentiate us. Take for example what we might call an old-school targeted ad from the mid-20th century. In this case, a housing developer used this flyer to entice white families to purchase a home in the Leimert Park neighborhood of Los Angeles, which is where my grandparents eventually infiltrated, which was the language used at the time, but at this point in the story the developers are trying to entice white buyers only by promising them "beneficial restrictions".
Ruha Benjamin: These were racial covenants that restricted someone from selling their property to black people and other unwanted groups. But then comes the civil rights movement and the Fair Housing Act of 1968, which sought to protect people from discrimination when renting or buying a home. But did it? Today, companies that lease or sell housing or jobs can target their ads to particular groups without people even knowing they're being excluded or preyed upon, and as ProPublica investigators have shown, these discriminatory ads are often approved within minutes of being submitted, despite Facebook's official policy.
Ruha Benjamin: Though it's worth noting that in just the last month, advocacy groups have brought the first civil rights lawsuit against housing companies for discriminating against older people using Facebook's targeted ad system. And so, in reflecting on the connection between the past and present, this combination of coded bias and imagined objectivity is what I term the New Jim Code: innovation that enables social containment while appearing fairer than the discriminatory practices of a previous era.
Ruha Benjamin: This riff off of Michelle Alexander's analysis in The New Jim Crow considers how the reproduction of racist forms of social control in successive institutional forms entails a crucial socio-technical component that not only hides the nature of domination but allows it to penetrate every facet of social life under the guise of progress. This formulation, as I highlight here, is directly related to a number of other cousin concepts by Browne, Broussard, Daniels, Eubanks, Noble and others, situated in a hybrid literature that I think of as race critical code studies. This approach is not only concerned with the impacts of technology, but with its production, and particularly how race and racism enter the process.
Ruha Benjamin: Two works that I'll just illustrate are Safiya Noble's Algorithms of Oppression, in which she argues that racist and sexist Google search results, like the pornographic images returned when you type in the phrase "black girls," grow out of a corporate logic of either willful neglect or a profit imperative that makes money from racism and sexism. In a different vein, Simone Browne examines how the history of surveillance technologies reflects and reproduces distorted notions of blackness, explaining that "surveillance is nothing new to black folks," from slave ships and slave patrols to airport security checkpoints and stop-and-frisk policing practices. She points to the facticity of surveillance in black life, challenging a techno-deterministic approach. She argues that instead of seeing surveillance as something inaugurated by new technologies, to see it as ongoing is to insist that we factor in how racism and anti-blackness undergird and sustain the intersecting surveillances of our present order. And so, to continue examining how anti-blackness gets encoded in and exercised through automated systems, I consider four conceptual offspring of the New Jim Code that fall along a kind of spectrum.
Ruha Benjamin: Engineered inequity names those technologies that explicitly seek to amplify social cleavages. They're what we might think of as the most obvious, least hidden dimension of the New Jim Code. Default discrimination names those inventions that tend to ignore social cleavages and as such tend to reproduce the default settings of race, class, gender, disability, among other axes. Here I want to highlight how indifference to social reality is a powerful force that is perhaps more dangerous than malicious intent. Coded exposure highlights the underside of tech inclusion, how the invisibility or technological distortion of those who are racialized is connected to their hypervisibility within systems of surveillance. And finally, techno-benevolence names those designs that claim to address bias of various sorts but may still manage to reproduce or deepen discrimination, in part because of the narrow way in which fairness is defined and operationalized.
Ruha Benjamin: For the sake of time, I'm just going to sketch the last three with examples. Default discrimination includes those technologies that reinforce inequities precisely because tech designers fail to seriously attend to the social context of their work. Take for example the carceral tools that underpin the US prison industry as a key feature of the New Jim Code. At every stage of the process, from policing, sentencing and imprisonment to parole, automated decision systems are being adopted. A recent study by investigators, again at ProPublica, which many of you are probably familiar with, examined the risk scores used to predict whether individuals were likely to commit another offense once paroled. They found that the scores, which were assigned to thousands of people arrested in Broward County, Florida, were remarkably unreliable in forecasting violent crime, and they uncovered significant racial disparities in the inaccuracies, the outputs of the algorithm, shall we say.
Ruha Benjamin: What's also concerning, I think, is how the system reinforces and hides racial domination by ignoring all the ways that racism shapes the inputs. For example, the surveys given to prospective parolees to determine how likely they are to recidivate include questions about their criminal history, education, employment history, financial history, and neighborhood characteristics, among many other factors. All of these variables have been structured in one way or another by racial domination, from job market discrimination to ghettoization.
Ruha Benjamin: The survey measures the extent to which an individual's life has been impacted by structural racism without ever asking an individual's race. Colorblind codes may on the surface appear better than a biased judge or prosecutor, but crime prediction is better understood as crime production, because those who are making these forecasts are also the ones who are making it rain. Coded exposure in turn names the tension between the ongoing surveillance of racialized populations and calls for digital recognition and inclusion, the desire to literally be seen by technology. But inclusion in harmful systems is no straightforward good.
Ruha Benjamin: Instead, photographic exposures enable other forms of exposure and thus serve as a touchstone for considering how the act of viewing something or someone may put the object of vision at risk, a form of scopic vulnerability central to the experience of being racialized. What I'd like to underscore is that it's not only in being out of sight, but also in the danger of being too centered, that racialized groups are made vulnerable. In Alondra Nelson's terms, this is a dialectic of neglect and surveillance at work, so that being included is not simply positive recognition but can be a form of unwanted exposure. But not without creative resistance, as I'll come back to in just a minute. First, one more brief interlude.
Lem: Hello? Motion sensors, I'm motioning. I'm motioning. Please sense me.
Speaker 7: One other thing, Lem mention that there's something weird going on with the motion sensors in the lab.
Veronica: Oh yeah, we replaced all the sensors in the building with a new state-of-the-art system that's going to save money. It works by detecting light reflected off the skin.
Speaker 7: Well, Lem says it doesn't work at all.
Veronica: Lem's wrong. It does work, although there is a problem. It doesn't seem to see black people.
Speaker 7: This system doesn't see black people.
Veronica: I know. Weird, huh?
Speaker 7: That's more than weird Veronica. That's basically racist.
Veronica: The company's position is that it's actually the opposite of racist because it's not targeting black people. It's just ignoring them. They insist the worst people can call it is indifferent.
Speaker 9: Nothing. We never should have let that white guy off.
Lem: We're eight black men in an elevator. Of course the white guy's going to get off. Veronica. Oh God, this looks way too aggressive.
Veronica: No, it's okay. I think I know why you're all here. Well, most of you.
Lem: I have something prepared. Veronica, you are a terrific boss.
Veronica: Thank you Lem, I'll take it from here. Let me start by apologizing on behalf of Veridian for this inexcusable situation.
Lem: I laid into Veronica pretty good. I figured it was my only shot, so I took the gloves off.
Speaker 10: Well, that sounds great Lem, sounds like you gave the company a really strong message.
Lem: Oh yeah. She said they're working 24/7 to make things right. Can you believe this?
Speaker 9: I know, isn't it great? We all get our own free white guys.
Lem: You like it?
Speaker 9: Yeah. Hey, Ty's the best. He anticipates everything I need. Plus, he picked up my dry cleaning. Oh, and he got this kink out of my neck.
Lem: Really?
Speaker 9: Mm-hmm (affirmative).
Lem: My white guy sucks.
Speaker 9: Well, maybe you're just not using yours right.
Stu: Maybe it's on you, dude.
Lem: Shut up Stu.
Stu: I got the worst black guy.
Speaker 12: It turned out Lem had also been thinking about the money issue and he put together some interesting numbers to show us, and then we all went to speak to management in a language they could understand.
Speaker 10: Within a margin of error of plus or minus one percent. And so if the company keeps hiring white people to follow black people to follow white people to follow black people, by Thursday, June 27, 2013 every person on earth will be working for us, and we don't have the parking for that.
Ruha Benjamin: All right, so the show brilliantly depicts how a superficial corporate diversity ethos, the prioritization of efficiency over equity, and the default whiteness of tech development work together to ensure that innovation literally produces containment. The fact that black employees are unable to use the elevators, doors, water fountains or turn the lights on is treated as a minor inconvenience in service to a greater good. This is the invisibilizing side of the process that Nelson describes as the surveillance and neglect that characterizes black life vis-à-vis science and technology.
Ruha Benjamin: Finally, some of the most interesting developments, I think, are those we can think of as techno-benevolence, which aims to address bias in various ways. Take for example new AI techniques for vetting job applicants. A company called HireVue aims to reduce unconscious bias and promote diversity in the workplace by using an AI-powered program that analyzes recorded interviews with prospective employees. It uses thousands of data points, including verbal and nonverbal cues like facial expression, posture, and vocal tone, and compares job-seekers' scores to those of existing top-performing employees to decide who to flag as a desirable hire and who to reject.
Ruha Benjamin: The sheer size of many applicant pools and the amount of time and money that companies pour into recruitment is astronomical. Companies like HireVue can narrow the eligible pool at a fraction of the time and cost, and hundreds of companies, including Goldman Sachs, Hilton, Unilever, the Red Sox, the Atlanta Public School system and more, have signed on. Another value added, according to HireVue, is that there's a lot that a human interviewer misses that AI can keep track of to make "data-driven talent decisions". After all, the problem of employment discrimination is widespread and well documented, so the logic goes, wouldn't this be even more reason to outsource decisions to AI?
Ruha Benjamin: Well, consider a study by a Princeton team of computer scientists, which examined whether a popular algorithm trained on human writing online would exhibit the same racially biased tendencies that psychologists have documented among humans. In particular, they found that the algorithm associated white-sounding names with pleasant words and black-sounding names with unpleasant ones, which should sound familiar to those who know the classic audit study by Bertrand and Mullainathan; this builds on that work to consider whether AI would do better than us. So too with gender-coded words and names, as Amazon learned last year when its hiring algorithm was found to be discriminating against women.
Ruha Benjamin: Nevertheless, it should be clear why technical fixes that claim to bypass human biases are so desirable. If only there were a way to slay centuries of racist and sexist demons with a social justice bot. Beyond desirable, more like magical: magical for employers, perhaps, looking to streamline the grueling work of recruitment, but a curse for many job seekers. Whereas proponents describe a very human-like interaction, those who are on the hunt for jobs recount a different experience.
Ruha Benjamin: Applicants are frustrated not only by the lack of human contact, but also because they have no idea how they're being evaluated and why they're repeatedly rejected. One job seeker described questioning every small movement and micro-expression and feeling a heightened sense of worthlessness because "the company couldn't even assign a person for a few minutes." And as this headline puts it, your next interview could be with a racist robot, bringing us back to the problem space we started with. But it's worth noting that some job seekers are already developing ways to subvert the system by trading answers to employers' tests and creating fake applications as informal audits of their own.
Ruha Benjamin: In fact, one HR employee for a major company recommends slipping the words Oxford or Cambridge into your CV in invisible white ink to pass the automated screening. In terms of a more collective response, a federation of European trade unions called UNI Global has developed a charter of digital rights for workers, touching on automated and AI-based decisions to be included in bargaining agreements. One of the most heartening developments to me is that tech workers themselves have increasingly been speaking out against the most egregious forms of corporate collusion with state-sanctioned racism, and if you're interested, just check out the hashtags of the #TechWontBuildIt and #NoTechForICE campaigns to get a glimpse of some of this work.
Ruha Benjamin: As this article published by Science for the People reminds us, contrary to popular narratives, organizing among technical workers has a vibrant history, including engineers and technicians in the '60s and '70s who fought professionalism, individualism, and reformism to contribute to radical labor organizing. The current tech workers' movement, which includes students across our many institutions, can draw from past organizers' experiences in learning to navigate the contradictions and complexities of organizing in tech today, which includes building solidarity across class and race.
Ruha Benjamin: For example, when the predominantly East African Amazon workers in the company's Minnesota warehouse organized a strike on Prime Day to demand better work conditions, engineers from Seattle came out to support them. Civil society initiatives like Data for Black Lives and the Detroit Community Technology Project offer an even more expansive approach. The former brings together people working across a number of agencies and organizations in a proactive approach to tech justice, especially at the policy level, and the latter develops and uses technology rooted in community needs, offering support to grassroots networks doing data justice research, including hosting what they call DiscoTechs, which stands for Discovering Technology: multimedia mobile neighborhood workshop fairs that can be adapted in other locales.
Ruha Benjamin: I'll just mention one of the concrete collaborations that's grown out of Data for Black Lives. A few years ago, several government agencies in Saint Paul, Minnesota, including the police department and the Saint Paul public schools, formed a controversial joint powers agreement called the Innovation Project, giving these agencies broad discretion to collect and share data on young people with the goal of developing predictive tools to identify at-risk youth in the city. There was immediate and broad-based backlash from the community, and in 2017 a group of over 20 local organizations formed what they called the Stop the Cradle to Prison Algorithm Coalition.
Ruha Benjamin: Data for Black Lives has been providing various forms of support to this coalition, and eventually the city of Saint Paul dissolved the agreement in favor of a more community-led approach, which was a huge victory for the activists who had been fighting these policies for over a year. Another very tangible abolitionist approach to the New Jim Code is the Digital Defense Playbook, which introduces a set of tools for diagnosing, dealing with and healing the injustices of pervasive and punitive data collection and data-driven systems.
Ruha Benjamin: The playbook contains in-depth guidelines for facilitating workshops, plus tools, tip sheets, and reflection pieces crafted from in-depth interviews with communities in Charlotte, Detroit, and Los Angeles, with the aim of engendering power, not paranoia, when it comes to technology. And finally, when it comes to rethinking STEM education as ground zero for reimagining the relationship between technology and society, there are a number of initiatives underway, and I'll just mention this concrete resource that you can download: the Advancing Racial Literacy in Tech handbook, developed by some wonderful colleagues at the Data & Society Research Institute.
Ruha Benjamin: The aim of this intervention is threefold: to develop an intellectual understanding of how structural racism operates in algorithms, social media platforms and technologies not yet developed; an emotional intelligence concerning how to resolve racially stressful situations within organizations; and a commitment to take action to reduce harms to communities of color. The fact is, data disenfranchisement and domination have always been met with resistance and appropriation, in which activists, scholars, and artists have sharpened abolitionist tools that employ data for liberation. This is a tradition in which, as Du Bois explained, one could not be a calm, cool, and detached scientist while Negroes were lynched, murdered, and starved.
Ruha Benjamin: From his modernist data visualizations representing the facts of black life to Ida B. Wells-Barnett's expert deployment of statistics in The Red Record, there is a long tradition of employing and challenging data for justice. Toward that end, the late critical race scholar, Harvard professor Derrick A. Bell, encouraged a radical assessment of reality through creative methods and racial reversals, insisting that to see things as they really are, you must imagine them for what they might be, which is why I think the arts and humanities are so vital to this discussion and this movement.
Ruha Benjamin: One of my favorite examples of a racial reversal in the Bellian tradition is this parody project that begins by subverting the anti-black logics embedded in new high tech approaches to crime prevention. Instead of using predictive policing techniques to forecast street crime, the White Collar Early Warning System flips the script by creating a heat map that flags city blocks where financial crimes are likely to occur. The system not only brings the hidden but no less deadly crimes of capitalism into view but includes an app that alerts users when they enter high risk areas to encourage citizen policing and awareness.
Ruha Benjamin: Taking it one step further, the development team is working on a facial recognition program to flag individuals who are likely perpetrators, and the training set used to design the algorithm includes the profile photos of 7,000 corporate executives downloaded from LinkedIn. Not surprisingly, the average face of a criminal is white and male. To be sure, creative exercises like this are only comical when we ignore that all of their features are drawn directly from actually existing proposals and practices in the real world, including the use of facial images to predict criminality.
Ruha Benjamin: By deliberately and inventively upsetting the status quo in this manner, analysts can better understand and expose the many forms of discrimination embedded in and enabled by technology. And so, if as I suggested at the start the carceral imagination captures and contains, then a liberatory imagination opens up possibilities and pathways, creates new settings and codes new values, and builds on critical intellectual traditions that have continually developed insights and strategies grounded in justice. May we all find ways to contribute to this tradition. Thank you.
Speaker 13: Questions. I'll let you talk about.
Ruha Benjamin: You can ask questions, but you can also offer brief reflections too in terms of anything that speaks to the work that you're doing, the things that you've been thinking about. Most Q&A's, they say, don't comment, just ask a question but the main reason why I wrote this book is to provoke conversation and thinking and so it's really useful to me to hear what you're thinking about even if it's not formulated as a question, so that said, keeping it brief.
Speaker 14: My background might be a bit unusual. I'm the only algorithm officer I've ever met and on my resume at one point I claimed to take partial credit for online advertising or at least one way of measuring it, but from my sort of Antarctica of theoretical physics there, the problem can ultimately be distilled to if you measure a few things, if you take a few numbers instead of a whole live person, which is made of millions of numbers a second, and you start processing it, selecting it, that already produces these results, it's not necessarily an ism. It doesn't even have to exist in the human brain. The mere process of taking in data, having a metric and optimizing by itself can yield all of these awful results, which is mathematically chilling, but in some sense releases us from having to think there are a lot of evil bad guys actually wanting people to suffer.
Ruha Benjamin: No, so the first part of it, in terms of the underlying reductionism, that's part of the process, I would agree, and partly what I'm trying to demystify is this idea that you need a racist bogeyman behind the screen. I'm trying to insist that even non-technologically mediated racism is not necessarily animated by animus, and that's the emphasis on indifference and the way in which, through the codification of various practices, just by clocking in and out of institutions and doing our jobs well, we can reproduce these systems. And so on that last point, I would agree with you.
Ruha Benjamin: People looking for the bogeyman are really trying to hinge the analysis on the intentionality to do harm. It doesn't serve us. It hasn't served us when we looked at old-school kinds of structural racism, and it doesn't serve us when we look at computer-mediated structural racism. And so we realize that it really is about thinking about what the underlying assumptions are: even if the assumption is that reducing people to these data bytes is a good thing, that is an assumption.
Ruha Benjamin: It's prioritizing one thing over another, and so I think that we're on the same page, but it sounds like in some ways you could offer that comment as a way for us to throw our hands up and say, "Well, there's either nothing we can do about it or this is inevitable." Or you could offer that as a way to say, "Let us question those ground truths. Let us question whether we want to engage in this mass scale reductionism." And so I think that same insight could lead to two different pathways in terms of whether we want to do something about it or not, what that might be but I think on that point we agree.
Speaker 15: Thank you for your brilliant talk and I don't use that word very often.
Ruha Benjamin: Thank you. Thank you for your amen-ing throughout. I really appreciate that. I can always tell when the audience has more black people because there's talk-back throughout. Usually no one laughs at the clip, and I'm like, this clip, why aren't you laughing? So thank you.
Speaker 15: It's nice to hear from the amen corner. I have a really quick question and then a comment. The real quick question is about that clip you showed, from what looked like a TV series or something.
Ruha Benjamin: Yeah, from a show called Better Off Ted.
Speaker 15: What is it called?
Ruha Benjamin: Better Off Ted.
Speaker 15: Better Off Ted.
Ruha Benjamin: ...and that episode is called Racial Sensitivity, and it's playing off of the charge brought against people who bring up issues of racism, that you're being too sensitive, but it's also thinking about the lack of sensitivity: the technology doesn't sense blackness in that clip. You can find it online.
Speaker 15: Better Off Ted. Okay.
Ruha Benjamin: It's off the air now though. It was too-
Speaker 15: Too deep for that, huh.
Ruha Benjamin: It was too subversive.
Speaker 15: Okay. Well, I come into this from a totally techno doofus standpoint. I'm an English major, retired journalist, singer, blah blah. Don't know tech from anything, but I want to learn.
Ruha Benjamin: Awesome.
Speaker 15: ...but you know, I was sitting here thinking what, how wonderful it would be if the late great Octavia Butler could be sitting here right now.
Ruha Benjamin: You're talking my language.
Speaker 15: ...watching this, and I read a lot of science fiction as a kid. That's what got me through kind of a rough underpinning of growing up, but I'm just... I don't even know what to say to your presentation because it's just so atomically deep, but I'm going to be chewing off of this for a long, long time, and I want to thank you, for even us techno doofuses, which... I might be a minority because I know you guys are in the school and I'm not, but you know, because the future is here, it's here right now, and I've been running around telling people about facial recognition ever since the ACLU did that study that showed all the members of Congress whose faces were thrown up as criminals, right, and I just feel like I'm older, I'm in the last trimester of my life, so I hope I'm around long enough to really see some of these things, but it's like they're already here. And the last thing I need to say is I keep a very low profile with social media, about as low as you could be and not be a CIA agent.
Speaker 15: I do occasional email, I don't do Facebook, I don't do any of that stuff. I know what it is and I talk to people all the time and I get clowned a lot, but I'm a retired journalist. I have a deep and abiding suspicion, always have about data collection and on my way here they were handing out pizza to young people on the quad who were willing to give up their data and I sort of checked that out and thought, do you guys know where that's going, what they're doing with that stuff? Anyway, I'm not really making a whole lot of sense but I'm just...you could speak to what I'm saying.
Ruha Benjamin: No, there's a lot. No, you are. It's perfect. There's so much there, so much I would like to reflect on and comment on. I mean, let's start with... I'm a student of Octavia Butler, first of all, and her work in Afrofuturism and speculative fiction really animates especially that last strand in the trailer, the through line about understanding that imagination is a battlefield. Having just visited her papers at the Huntington Library a few months ago, I saw the way that she was engaging scholarly work on medicine, embodiment and so on, but then deciding in her journals that the best way for her to actually seed a critical understanding of science and technology was through her novels, and so she made a very deliberate choice.
Ruha Benjamin: There's a syllabus there of her taking a medical sociology class and then collecting all these headlines from newspapers about epidemics and then pivoting to say, okay, she's going to take all of this and actually embed it in her work as a creative, and so certainly that continues to inspire and influence me. On the issue of being a techno doofus, partly what I'm trying to do with this work is to draw in a broader public, people who don't necessarily identify as tech-savvy, because all of these developments are impacting everyone, but only a small sliver of humanity is empowered to actually shape the digital and material infrastructure, and I think that is really one of the things that we have to address. So you are my intended audience in that way, and I'm really glad you're here, but also in part I'm trying to question the very idea of what we think of as innovation.
Ruha Benjamin: I feel like that idea of who innovators are, what innovation is, has been colonized by a very small set of people and practices. When we think about the fact that black people are just alive and here today, we have had to do a lot of innovation, technological and otherwise, to be here and wonderful and thriving, right? And so in part what I want to say is that there are all kinds of social technologies, ways in which we have had to innovate in terms of living and surviving in this environment, that are devalued and yet still appropriated in various ways.
Ruha Benjamin: I want us to question the whiteness of innovation, the whiteness of technology, and to understand all of the ways that we have been part of that, and to question that and challenge that, and that's also why I draw upon Du Bois and Ida B. Wells-Barnett. On the last point, about walking over and seeing the students getting the pizza, you know, I don't know how many of you saw the headlines about Google last week having their contract workers target homeless people in Atlanta, specifically black homeless people, in order to take facial images to diversify their facial recognition system because their new phone is coming out. And so it's just... that whole set of decisions: who was sitting around the table to say that's a good idea? Yeah, go after homeless people.
Ruha Benjamin: Just thinking about it: one, the fact that they didn't have enough black people working at Google to do it tells us something, but also the fact that science and technology have often been built on the backs of the most vulnerable populations, and so that whole history, from Marion Sims and gynecology to prison experiments, you know, Acres of Skin, if you haven't read that book.
Ruha Benjamin: Henrietta Lacks, Tuskegee. I mean, medical apartheid. The fact that we sort of have this collective amnesia about how vulnerable bodies have been the input for so much. In this case, the desire to build an inclusive product, which on one level is a good development, right? We would think, okay, so this research has come out to show that facial recognition is not that great at detecting people with darker skin, and out of that grows a desire and an awareness and an intention to build an inclusive product, but to get to that inclusive product, we have a coercive process, right? And so we need to think not just about the ends but the means and what's sacrificed in the process, and I would call that not just ethics but the politics of knowledge production and technology development.
Ruha Benjamin: That's just one episode in this much longer history, and I do think that we need to look at the way that students in particular are enrolled. I believe I saw, some months back, a similar recruitment strategy in Miami or elsewhere in Florida. They were going after college students and giving them similar little gift cards for coffee or something in order to get their facial images. Again, that goes back to the part of the talk about really thinking about what my colleague Keeanga-Yamahtta Taylor calls predatory inclusion, right? Inclusion is not a straightforward good, and we should think about what systems we're being included in, and not just seek out that inclusion but be able to step back and question what we're being enrolled in. So there was a lot there.
Chad: Hello?
Ruha Benjamin: Hi Chad.
Chad: You have a great memory.
Ruha Benjamin: Yeah, I don't usually, but I remember. Yeah.
Chad: Thank you. I work at the Oakland Impact Center, and we are developing a program to get at-risk youth to learn how to code, but piggybacking off the gentleman's concerns and understanding that we're kind of all indoctrinated into systematic oppression, it seems like even having young black coders wouldn't help the problem, so what is a solution? Because it seems like it's an infrastructure problem, like we have to erase systematic oppression to erase the systematic oppression in coding. What would you give as a solution?
Ruha Benjamin: You know, we've seen in the last few years a big push: Girls Who Code, Black Girls Code, Everybody Code, and it's not to knock that, but it's to say that just building up technical capacity is not only not enough, it can easily be co-opted, and so I would say in this case, yes, train these young people to get these skills, but integrate into that not only the technical capacity, but the critical capacity to question what they're doing and what's happening, right?
Ruha Benjamin: To me it's not true empowerment unless people have the power to question how these skills are going to be used, and so in any coding program, I would say it's really not enough to think that that in itself is a form of empowerment if you don't have the sort of social, cultural toolkit that goes along with the technical toolkit, right? And that goes for others too... there are all kinds of camps in the summers for kids in all kinds of STEM fields, and I would say that's true not just of coding camps but of all the other things, in which true education is about being able to shape your reality, not just fit into the reality created by others and be a cog in that machine.
Ruha Benjamin: Although I've painted a very big picture, that the problem is structural and contextual and it's big, what that means is that there's almost nothing we can do that can't in some way contribute to questioning and changing it, because the issues are so vast and coming to us from so many different directions. That means we can all find a way to plug in and be able to redress this and deal with this. I hope that none of you walk out of here feeling overwhelmed, like there's nothing we can do. The take-home is that there's everything we can do. We just have to find our piece of it, right? And then link arms with others who are working in other areas. I hope that you feel more emboldened to find your part in that.
Speaker 17: Hi. An example of something going on on campus: with gene editing, right, we have some people here and we're trying to cure sickle cell, with an understanding of the demographic that primarily has to deal with sickle cell, and we're engaging in conversations with them and understanding what they think of the technology. But even if it works, what do you do with a $1 million treatment? We're here creating it, but they can't even afford it, and so by doing good are we still propagating inequality?
Ruha Benjamin: Yeah. That is a question. My first book engages some of that... at that point, we weren't dealing with CRISPR yet, but whatever the newest genetic techniques are, the sickle cell community is always first in line as the patient population on which people try to hone that new technique. There's a long history of that; we can look at Keith Wailoo's work and others. And so it's good that you're already in conversation with communities, but I do think there's this larger issue of health disparities, and once something is developed, even if it is developed, people won't have access to it, so what is your responsibility in that, or your research community's responsibility in that?
Ruha Benjamin: I mean, partly, in my more utopian moments, I feel like we all have our professional hats, but we are also all members of this society, and we should think about what kind of advocacy work we can do, as in: use your legitimacy, use your sort of capital as researchers to advocate around whatever these larger structural issues are, knowing that the community you're trying to help is not going to have access to this. So what does that look like?
Ruha Benjamin: I was talking to a student earlier today, and one of the examples of this that to me is a model for rethinking who we are as researchers and academics is the movement called White Coats for Black Lives. This is a group of medical students across the country who are in medical school and realize that their medical schools aren't training them to be responsive physicians and healthcare practitioners, because those schools do a very poor job of addressing issues of racism and equity in the medical school curriculum. And so they have linked arms, and one of the things they do, among many, is to issue report cards on their medical schools to say, you got a C, you got a D, in terms of actually incorporating training to understand and mitigate the harms of racism in the health professions.
Ruha Benjamin: That's an example of students understanding that they're not just students and not just consumers of knowledge, but that they have a responsibility to think about what the profession is that they're being trained into, and it kind of goes back to this example as well: what does that look like in all of our little corners of research in academia? And so in that case, what is the responsibility of people who are doing research meant to serve a particular patient community? Can they only focus narrowly on the scientific or medical question without in some way engaging, as a professional community, with these larger issues that you're describing? I think there's just more we can do.
Ruha Benjamin: There's not a lot of incentive to do it, right? Like these students in White Coats for Black Lives, they're taking time out of the other things that they could be doing, but they understand the importance of it, and so that's one example that I think can serve as a model for others.
Jasmine: Hi. My name is Jasmine. I'm an undergrad visiting scholar from Hong Kong. I'm really impressed by what you... like all of the information and your insights.
Ruha Benjamin: Thank you.
Jasmine: ...kind of eye-opening, and it's interesting because yesterday I attended the Berkeley Forum, where a guy also talked about the relationship between humanity and technology. For me, to be honest, personally I feel a little bit overwhelmed, because I am from a background of literature and sociology, and I have no professional knowledge about technology at all, and it's a common problem that most of the time we aren't really aware of the potential risks and threats behind the technology we're using, and I really appreciate that you raised it. My concern is, because I'm going back to Hong Kong after-
Ruha Benjamin: Not much going on there.
Jasmine: Well, I can tell, from the perspective of a Hong Kong citizen there, there is so much manipulation with social media and technology, and at some point we just feel helpless, because there's not much we can do. But now, after your empowerment, after the previous inspiration, the take-home is that we actually have something to do, and it's just about how you personalize whichever smallest piece you can do. So I'm saying... I'm actually doing research related to police brutality and also criminal justice. I just kind of want to seek your advice, like how can I glocalize?
Ruha Benjamin: What's that word?
Jasmine: Glocalize. Like globalized, but local: bringing something global and making it local. I mean, this is a globalized issue, and you bring that broader perspective into somewhere narrower; it works best when it becomes localized. I'm just thinking about how I can bring this conversation into Hong Kong, somewhere I'm more attached to.
Ruha Benjamin: Yeah.
Jasmine: Yeah, and also, do you have any tangible reminder for a poor student like me, for when I pass by some booth giving out a free gift or free pizza? I was starving when I came in here because I was too busy with my research and didn't have time to get lunch, so it's a great temptation in that situation. How would you suggest I remind myself that I'm selling my data to someone, just for a slice of pizza?
Ruha Benjamin: Two things. To the last point, you can remember the well-known line: there's no such thing as a free lunch. Even when you think about the idea of having access to something, access to technology, access to something free, if you have access to something, it has access to you, right? Just think about the context of educational technologies: a lot of people are starting to realize that yes, putting a laptop in front of every kid gives them access, but the technology also has access to all of the data that's being fed back. So think of this as a two-way line, and not a horizontal one, in terms of these data-driven systems.
Ruha Benjamin: I'm thinking, for example, about these growing local revolts against this kind of access. I think about the Brooklyn high school students who were sitting in front of a Facebook-backed educational technology system for I don't know how long, with only 15 to 20 minutes of time with a human teacher a week, and they walked out and revolted. And there's a whole town in Kansas, for example, where the school board adopted another such system, and the parents, the students, everyone rose up against it.
Ruha Benjamin: I think your intuition is right in terms of local movement and action, and one of the first things I would suggest when you go back is to pick up a book, if you haven't seen it yet, called Twitter and Tear Gas by Zeynep Tufekci. She's actually been in Hong Kong these last few months. She studies the role of social media and technology in these movements; she did a lot of work on the Arab Spring and in her native Turkey, but now she's there. Get the book, but also follow her on Twitter, because she's been live-tweeting her reports. Read that book, and as you're reading, think about how a lot of the concepts she develops apply when you go back. That's what I would say for you, and good luck. Yeah.
Speaker 19: Hi.
Speaker 20: Just, yeah, thank you so much for this talk. I used to be an English professor, and I started working with nonprofit organizations around these issues because I was getting depressed and feeling useless. One of the things that I loved about this talk was the way you linked imagination and critical thought and gave a place for literature and artistic renderings of all of this, and it just made me think, oh, maybe I can-
Ruha Benjamin: Absolutely.
Speaker 20: ...bring that back into what I used to do. I mean, I'm happy doing what I do with the nonprofit organizations, but the other thing I wanted to say, and the question I want to ask, sort of links with your question. I've been thinking a lot about privacy nihilism as I've been working with young people and parents around screens in schools, and there is that sense of, well, they've already got me, so I might as well give up. After your talk, and I can't wait to dig more into your book, I feel like I'm going to be better equipped to address that attitude, but I wanted to ask you right now if there is anything you could say to get the wheels turning more quickly.
Ruha Benjamin: Yeah. Some of those examples I just mentioned in the last comment, but also, I'm looking at my colleague, Professor Desmond Patton, sitting right there, who's the expert on young people and families and technologies, so can you raise your hand? Yay. He's actually speaking tomorrow, if he can come back to campus. At what time? You don't know. Okay, we'll find out. Does anyone know? 4:00 P.M. Okay. One of the things I learned through his work is that yes, nihilism might be part of it, but there is also a desire: parents think of technology and surveillance as a way to know things about their children, to keep them safe.
Ruha Benjamin: There's a desire for it if the alternative is to have no information, or to feel like you have no control. I'm probably misrepresenting exactly what the takeaways are, but all I know is that it's more complicated than a binary between top-down surveillance and a kind of liberatory approach. There is a middle way in which people actually want to use these technologies to enact some form of safety through data collection. So I would say look up Desmond Patton's work, and I don't know if they're live-streaming his talk or not, but I'm really thinking about the bottom-up demand for certain ways to track in the name of security and safety. That's not as helpful, but that's my go-to. Do we have time for one more question? Anyone have the mic? Yes, go ahead.
Speaker 21: All right. I'm a computer science person. I haven't spent time really thinking about fairness, although I know that some of my colleagues do, including... do you know Moritz Hardt?
Ruha Benjamin: I don't.
Speaker 21: Okay. He's on the faculty here. His office is in this building.
Ruha Benjamin: Awesome.
Speaker 21: They actually have this organization called FAT/ML, which is Fairness, Accountability, and Transparency in Machine Learning. Again, I'm not an expert, and I don't know if Moritz is in this room right now, but they taught a course on some of the papers they'd been writing in this space, and one of the things I saw in one of their lectures was that you can imagine trying to come up with a mathematical definition of what it means for, let's say, an algorithm to be fair.
Speaker 21: Like you might say, in some of the pictures you showed, the app failed to identify that a person was there, the dark-skinned man; somehow that's unfair to Black people, the app is unfair. Or the scores it gives are higher for one group; that's unfair. So they tried to pin down what it means to be fair, and one of the definitions was, I think... I may get it wrong again, it's not my area of research, but predictive parity: what's the probability that you say they're going to default on the loan, given that they're actually going to default on it? And what is that for white people, what is it for Black people?
Speaker 21: Do you get the same kind of accuracy for both? And then there was another one on the negative side: what's the probability that you say they're not going to default, given that they're not going to default? If you get that the same for different groups, Black and white, say, you call it fair by predictive parity, and then there's false positive parity, which is: do you have the same false positive rate for Black people and white people?
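To make the two notions the questioner is reaching for concrete: under the usual textbook definitions, predictive parity compares the positive predictive value (the chance of an actual default given a predicted default) across groups, while false positive parity compares false positive rates across groups. The sketch below is a minimal illustration with invented labels and invented group names, not anything presented in the talk.

```python
# Minimal sketch of group-wise predictive parity and false positive parity.
# All data here is invented purely for illustration.
import numpy as np

def group_metrics(y_true, y_pred, group):
    """Return {group: (ppv, fpr)} for binary labels and predictions."""
    out = {}
    for g in np.unique(group):
        m = group == g
        t, p = y_true[m], y_pred[m]
        tp = np.sum((p == 1) & (t == 1))
        fp = np.sum((p == 1) & (t == 0))
        tn = np.sum((p == 0) & (t == 0))
        ppv = tp / (tp + fp) if (tp + fp) else float("nan")  # predictive parity compares this
        fpr = fp / (fp + tn) if (fp + tn) else float("nan")  # false positive parity compares this
        out[g] = (ppv, fpr)
    return out

# Hypothetical labels: 1 = "defaults" / "predicted to default"
y_true = np.array([1, 0, 1, 0, 1, 0, 0, 1, 0, 0])
y_pred = np.array([1, 0, 1, 1, 0, 0, 1, 1, 0, 0])
group  = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

for g, (ppv, fpr) in group_metrics(y_true, y_pred, group).items():
    print(f"group {g}: PPV={ppv:.2f}, FPR={fpr:.2f}")
```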
Speaker 21: And then the theorem is that no mechanism can have both; it's mathematically impossible for a machine learning algorithm to achieve both at once. That's a negative result. Of course, people are still doing research in the area, and you can get some other positive results, but when I see that, I guess I want to ask you, as a sociologist: what should they do, or what would count as success?
Ruha Benjamin: Yeah, no, I think that's an important question, and it has actually arisen within the FAT* community, in terms of the conference they hold every year. One of the things that has evolved is a questioning of those narrow definitions of fairness, and at this coming conference they've developed a track... I don't know exactly what the acronym stands for, but it's called CRAFT. It's a critique and an evolution of the conversation within that field, saying that we can't just limit ourselves to trying to hone these very narrow definitions of fairness when the larger context we're trying to model is so deeply unjust.
Ruha Benjamin: Just take the crime risk scores. If the attempt is to make sure the predictions match the rate of crime in the larger society, as one particular form of fairness, but that crime rate itself has been produced through racial profiling and policing, then the model is simply mirroring that social reality. By that definition you say, okay, we have a good model, without ever questioning the underlying crime rate you're trying to match, and that presents a very narrow field of action.
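A quick way to see this point about recorded crime rates: if one group is policed or surveilled more heavily, the recorded rate diverges from the underlying behavior, so a model whose predictions faithfully match the recorded rate still inherits the enforcement gap. The sketch below uses invented numbers purely for illustration; nothing in it comes from the talk.

```python
# Hypothetical illustration: identical underlying behavior in two groups,
# but group B is surveilled three times as heavily, so its *recorded* rate
# is roughly three times higher. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_rate = 0.05                      # same underlying behavior in both groups
detection = {"A": 0.2, "B": 0.6}      # group B is observed 3x more often

for g, p_detect in detection.items():
    offended = rng.random(n) < true_rate
    recorded = offended & (rng.random(n) < p_detect)
    print(f"group {g}: underlying rate={offended.mean():.3f}, "
          f"recorded rate={recorded.mean():.3f}")

# A model judged "fair" because its scores match the recorded rates would
# still rate group B as roughly 3x riskier, despite identical behavior.
```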
Ruha Benjamin: So within the FAT*, FAT/ML community, there are now a number of proposals; I was a reviewer, and a number of panels and proposals came in trying to think about how people who are trained in the data sciences can actually contribute to a larger rendering of what fairness and equity are. Many of the proposals I looked at had built into them some form of community partnering, whether it was working with the ACLU in one state or with an organization in another. It was an understanding that you can't just define fairness with technical definitions.
Ruha Benjamin: I have a colleague at Princeton, Arvind Narayanan, who many of you may know; I think his book is out now, or will be out soon, and he has catalogued something like 21 definitions of fairness. If you follow him, you can see the same conundrum again: you can't meet all of them, it's a trade-off. So what that might mean is that, rather than playing the game of deciding which one of these definitions to optimize, you step back from it. One of the proposals I read was interesting, and I'll end with it.
Ruha Benjamin: I don't know if we have time for another question, but I think it was a team from Microsoft, actually, and part of the proposal had to do with how you equip people within an organization or a company so that, when they see that whatever they're designing or building is likely to have X, Y, and Z harms or effects, they can actually refuse or stop the process. Rather than a trade-off, should we do this or that, maybe we don't build it at all. Really thinking about what refusal looks like, rather than just trading off between more and less harm, is an interesting way to think about what it would mean to equip people, especially people who might be low in the hierarchy of an organization, to actually be able to speak up and stop particular things.
Ruha Benjamin: That is a long way of saying that at the upcoming FAT* meeting, you're going to have this track and an opportunity to pose those questions and think about alternative ways to move forward. And that evolved from within the community, which I think is a good thing. Are we out of time? All right, thank you all. Take care.