The Social Life of Algorithmic Harms (learnings & findings)
The Social Life of Algorithmic Harms Workshop + From my filter bubble
In March, I participated in the Data & Society workshop The Social Life of Algorithmic Harms, hosted by Jacob Metcalf and Emanuel Moss.
What is the social life of algorithmic harms?
Algorithms (pairings of data sets with processes) increasingly shape our social lives. In doing so, they also produce various categories of harms and undesired consequences. The aim of this workshop was to organize algorithmic harms in a practical way so that they can be politically and judicially regulated.
The workshop focused on the social life of algorithmic harms, meaning it took as its premise the idea that we only know a small proportion of the ways algorithmic systems negatively impact the world, individuals, and communities.
The keynote speakers on the panel were Ali Alkhatib, Margot E. Kaminski, and Tawana Petty. Key takeaways from the panel:
How to conduct research on algorithmic harms? “Be conscientious about the power you bring into the conversation, be mindful and listen.” - Ali Alkhatib
“Laws that are implemented to protect from algorithms rely on a collective approach. However, to erase the individual harm from the picture is a non-starter.” - Margot Kaminski
(my favorite) “We are in an early stage of fairness and accountability and need more focus on what are the experiences of people. Put less focus on algorithmic bias and more focus on algorithmic harm.” - Ali Alkhatib
“We need to construct legal apparatus that would empower us to have more control in how algorithmic systems impact our lives.” - Margot Kaminski
“Sometimes questions are more important than answers. We need a slow conversation on what needs to exist in the world.” - Tawana Petty
What the panel proved to me once again is that we are in a “golden age” of qualitative research methods (be conscientious and listen, questions are more important than answers…).
YouTube video of keynote talks:
As a workshop participant, I also reviewed two fantastic works in progress. One is a Ph.D. dissertation chapter that I am not allowed to share. The other is a forthcoming paper by Elana Zeide. Reading the paper made me want to delete my LinkedIn account and other social media. The main argument of the paper is that algorithmic systems create a “silicon ceiling”, an invisible barrier to opportunity. You can read more about the paper below:
The Silicon Ceiling: How Algorithmic Assessments Construct an Invisible Barrier to Opportunity by Elana Zeide
Schools increasingly use artificial intelligence in instruction. Personalized learning systems take on a whole host of other educational roles as well, fundamentally reconfiguring education in the process. They not only perform the functions of “robot teachers,” but make pedagogical and policy decisions typically left to teachers and policymakers. Their design, affordances, analytical methods, and visualization dashboards construct a technological, computational, and statistical infrastructure that literally codifies what students learn, how they are assessed, and what standards they must meet. Educators and legislators can no longer afford to overlook the pedagogical and policy implications of their technology choices.
From my Filter Bubble
Roxane Gay’s essay on why the notion of “thick skin” is problematic: Jada Pinkett Smith Shouldn’t Have to ‘Take a Joke.’ Neither Should You.
“Instead, this is a defense of thin skin. It is a defense of boundaries and being human and enforcing one’s limits. It is a repudiation of the incessant valorizing of taking a joke, having a sense of humor. It is a rejection of the expectation that we laugh off everything people want to say and do to us.”
Who Is the Girlboss Now? An article by The Cut
“On TikTok, the incandescent girlboss that millennials identified with is now a ghost, the poster child of a bygone era of pop feminism”
An academic article by Kelley Cotter titled “Practical knowledge of algorithms: The case of BreadTube”:
A podcast conversation with Emma Chamberlain in which the YouTuber gives many fantastic insights into the creator economy: “It’s like a disease, your boss is public opinion.”
Illustrations by Gulia Hartz:
This news made my day: Taylor is receiving a Doctor of Fine Arts, honoris causa from NYU, and will be speaking at the commencement for the Class of 2022! Long story short, she survived. Congrats, Taylor! 🎓🎉 #NYU2022
Twitter, of course, responded to the news in quite a hilarious way: