Slow Down, Listen, Disrupt Bias, Repeat with Dr. Desmond Upton Patton
Listen on Apple | Listen on Spotify
In This Episode
In this episode of When Bearing Witness, Dr. Desmond Upton Patton invites us into a conversation about humility, listening, and the ethics of digital storytelling. He shares what he has learned from years of research at the intersection of social work, technology, and racial justice.
And he doesn’t stop at analysis: he challenges us to examine our own assumptions and shows how deep listening can disrupt bias and reshape how we understand digital expression.
A researcher, social worker, and AI ethicist, Dr. Patton explores how grief is often misread as aggression, and how those misreadings can escalate both online and offline violence. But the solutions, he reminds us, are not just technical; they are relational. He offers a model of trauma-informed storytelling rooted in cultural humility, reflexivity, and active listening.
We all bring our own lens to storytelling. But to do this work ethically, especially in nonprofit storytelling and storytelling for social impact, we have to slow down and ask hard questions: Am I the right person to tell this story? What information is missing? Who do I need to engage with, and how do I do that in a trustworthy and ethical way?
About Dr. Desmond Upton Patton
Dr. Desmond Upton Patton is a leading expert at the intersection of social work, technology, and racial justice, known for his groundbreaking research on how social media impacts grief, trauma, and violence in communities of color. A Penn Integrates Knowledge University Professor with appointments across social policy, communication, and psychiatry, he developed the Contextual Analysis of Social Media (CASM) to address bias in AI by centering cultural nuance and lived experience. His work has shaped national conversations on digital violence and empathy, informed tech safety policies at companies like Twitter and Spotify, and been featured in The New York Times, Nature, and NPR.
Connect with Dr. Desmond Upton Patton
Connect with Maria
Speaking & Training | LinkedIn | Email
Transcripts
Maria: Welcome back to When Bearing Witness. Today, I'm so excited and honored to be joined by Dr. Desmond Upton Patton. He's truly a trailblazer at the intersection of social work, AI, and digital communications. Dr. Patton’s groundbreaking research explores how social media influences mental health, trauma, and violence, particularly in communities of color. He investigates how online spaces can fuel harmful behavior offline.
This is such important research. I'm so excited to have you here to talk about grief, violence, and digital spaces. Dr. Patton, welcome to the show.
Desmond: Thank you so much for having me.
Maria: You have spent years studying how social media impacts wellbeing and behavior, especially for folks of color. I’d love to know a little bit more about your story and what drew you to explore this intersection.
Desmond: It started during my PhD program. I went to the Crown Family School of Social Work, Policy, and Practice at the University of Chicago. I initially went there to study urban education. One of the great things my advisor did was place me in a research center, the Consortium on Chicago School Research, where we were studying the transition to high school.
We wanted to understand how high-achieving eighth graders’ connection to education shifted once they moved from their K–8 schools to high school. I was on the qualitative team, visiting schools and having conversations with young people from the South and West Sides of Chicago.
One of the most interesting things was how difficult it often was to interview students because they wouldn’t come to school. Gun violence was a major reason. Students were either victims themselves or had to traverse invisible gang boundaries, often losing siblings, friends, or loved ones. This happened over and over again.
I became really interested in how those contextual factors impact how young people think about their education. That shifted my work to looking at the impact of community violence. For my dissertation, I was interviewing high-achieving Black boys and young men at a charter high school in one of the most violent neighborhoods in Chicago. I wanted to understand how they maintained 4.0 GPAs while navigating that environment.
Something that kept coming up was social media, particularly Twitter at that time. It was a key space for identifying safety: learning which parties to attend and who would be there. They were essentially geocoding safety in their school and neighborhood. That experience made me start thinking about social media not as a virtual platform, but as a neighborhood.
Maria: I want to dig deeper into a particular study, one of the most cited on this topic. We often think of social media as just online, but your work shows what happens offline too. You researched the relationship between social media and gang violence, and how grief can be expressed through aggression. Tell us more about this study and its implications.
Desmond: Around 2013, I began looking more deeply into the connection between social media and gang violence. I co-authored a paper with classmates from Chicago on a concept we called "internet banging," a play on gang banging. It referred to the unintended uses of social media that resemble gang-related activity.
That paper led to a series of qualitative studies. We began interviewing young people in Chicago who were impacted by gun violence or were gang-involved. We wanted to understand how they navigated social media in ways that sometimes led to offline violence.
We learned a lot about language, haunting, and the emotional dynamics of seeing people on these platforms. We also started using this context to train AI tools to identify language that might escalate into violence.
But what we found when analyzing language, emojis, and hashtags was that a lot of young people were expressing grief. They posted pictures, texts, and lyrics to honor people they had lost. These posts often came first.
Then someone from a rival gang, or even someone from the same neighborhood, might disrespect that post. They might mock it or call it out negatively. Slowly, the language of the original poster would shift. It would start as grief, then become more aggressive. We saw this pattern again and again in the data, often within a two-day window.
Maria: And would this sometimes show up offline too?
Desmond: Yes. The arguments often start online and then become the trigger for offline violence. By the time someone is harmed in a school or neighborhood, the argument has already happened online. But the challenge for violence interrupters, the people working on the ground to keep neighborhoods safe, is that they don’t have access to those online arguments. So it’s hard to deescalate something you don’t know has started.
Maria: How have folks used this research to actually help deescalate violence? What’s your hope or vision?
Desmond: The original hope was to create automated tools that violence interrupters could use to flag risky conversations online. But we ran into problems with bias. The young people we work with often use African American Vernacular English (AAVE), slang, and shorthand. The AI tools frequently misinterpreted this language—often mistaking grief for aggression.
That created a serious ethical dilemma. I worried it could lead to another form of incarceration by surveilling Black communities and misreading their expressions. So we ultimately didn’t move forward with the tool.
What we’ve seen instead is a growing awareness of how social media can escalate violence. Community-based interrupters have started using their own knowledge of language and culture to intervene in online arguments before they become offline violence.
Maria: That’s actually a huge deal—that you invested so much in this solution and still had the courage to say, “This might cause more harm than good.” It shows deep integrity. Do you think we’ll ever get to a point where AI can be helpful here, or does language evolve too quickly?
Desmond: It’s a great question. I do think AI can help, but only if we change how we ask questions and how we train the models. AI has helped me understand root causes of violence that play out on social media. We can use simulated environments—without real people—to model how language patterns develop.
There’s no question that serious harm can start on these platforms. Not just in Chicago or among Black youth. This happens across society, at every level.
But we need interdisciplinary teams. We need community input. We need ethical frameworks. Too often, AI projects are developed quickly, by people who don’t understand the communities they’re affecting, simply because it’s profitable.
Maria: Two themes that keep coming up in your work are empathy and slowness. So many storytellers work in online communities they’re not part of. What can they do to cultivate empathy and slow down?
Desmond: Active listening and reflexivity are key. For me, listening means recognizing where my expertise ends and someone else’s begins. I can’t lead with assumptions. I have to pause and acknowledge the gaps in my understanding.
I learned this from following a young woman on Twitter, Ja’Chira B. I had so many assumptions about her and what her posts meant. But time and again, she challenged what I thought I knew. As a social worker, I had to reflect on my own positionality—being a Black gay man, an endowed professor. My identities shape how I interpret posts and how I use them.
We all do this. We all bring our own lens. But to do this work ethically, especially as storytellers, you have to slow down. You have to ask hard questions: Am I the right person to tell this story? What don’t I know? Who should I partner with?
Maria: When I think back to 2008, it felt like the internet was already moving fast. Now it’s light speed. But if we’re going to manifest something better, what do you hope for in terms of safe, inclusive, and even joyful digital spaces?
Desmond: I recently saw a video on TikTok of two Black brothers. One was struggling with depression, and the other surprised him with a visit. They hugged for what felt like forever. It was so powerful—raw, healing, loving.
That’s the kind of experience I want more of online. People using these platforms to build connection, share joy, support mental health. But to get there, we need platforms designed with us, not for us.
Right now, many systems are optimized for outrage and conflict. That’s what drives clicks. But people don’t just come online to consume trauma—we’re here for joy too. We need to demand digital environments that reflect the way we want to live.
Maria: Dr. Patton, I’m so grateful for your wisdom and presence. How can people support your work and stay in touch?
Desmond: You can follow me on LinkedIn or BlueSky. My handle is Dr. Desmond Patton. My book, Facing Ja’Chira: Life and Death in the Digital Streets of Chicago, will be out next year. Until then, I’ll be sharing updates on those platforms.
Maria: Thank you so much, Dr. Patton. This has been such a joy.
Desmond: Thank you.