Learn More About Desmond U. Patton
The algorithms that power artificial intelligence (AI) tools are built on a foundation of human judgment calls – but humans sometimes make mistakes. How can organizations confidently take advantage of the benefits associated with modern technology while avoiding negative impacts on surrounding communities?
“Human experts and tech experts need to work together,” says Desmond Upton Patton, the University of Pennsylvania’s Brian and Randi Schwartz University Professor (the highest professorship at Penn) and Penn Integrates Knowledge University Professor who also has joint appointments in the School of Social Policy & Practice, the Annenberg School for Communication, and in the department of psychiatry in the Perelman School of Medicine.
“Existing data science techniques cannot accurately understand key cultural nuances in language among communities of color. Our methodologies – which center and privilege culture, context and inclusion in machine learning and computer vision analysis – create non-biased and culturally nuanced algorithms to give tech companies a holistic perspective on various business and social issues. The companies that adopt these proactive measures are then able to ensure they are not unintentionally propagating bias.”
At the SAFElab, where Patton is the founding director, social workers and local residents join Patton and his team to add context to social media messages, which helps programmers build algorithms that interpret those messages correctly. Community-based partnerships between social workers and technology developers reframe innovation to incorporate a fuller spectrum of humanity, creating a more useful, equitable and joyful environment. By nurturing these relationships, organizations see their time and financial investments pay off as their solutions are adopted because they better serve customer needs. Tapping a diverse group to incorporate its expertise into the data used by technical systems yields crucial knowledge and insights, ensuring that the resulting output will be used and embraced.
A pioneer in fusing social work, communications and data science, and the most cited and recognized scholar studying how groups constructed online can influence behavior offline, Patton helps organizations create processes that connect employees with customers, enabling their products to affect people more broadly. Through keynote presentations, interactive workshops and as an advisor to AI companies Kai.ai and Lifebrand, Patton helps organizations develop a better approach to diversity and inclusion that includes fairer practices to address the challenge of prejudice, rather than contribute to it.
How to Expand Your Product’s Audience
Historically, breakthrough technologies like AI or augmented and virtual reality have been wielded exclusively by data scientists or software engineers, but Patton says it’s time for that to change. He encourages computer and data scientists to move beyond a reflection-based mindset to drive true inclusion. Pointing out that many developers create products based on non-representative market research that excludes viewpoints from people of color, people with disabilities, LGBTQIA+ people and other populations, he recommends teams adopt reflexive thinking strategies that ask critical questions about the context of data. Patton’s practical recommendations – adding steps like naming, active listening and processing to developer training, then reaching out directly to underrepresented groups to incorporate their points of view – ensure those voices make an impact.
“Being able to talk to people like anthropologists, political scientists, community members and computer scientists matters for getting outside of restricting narratives,” stresses Patton, a former Fellow at the Harvard Kennedy School’s Carr Center for Human Rights. “You cannot, and should not, be an ethical engineer if you have not gone through a process in which you have to deal with your impact on the things you’re developing. If we can make that a requirement, then I think that we will slowly get to a space where people can at least be active in these conversations, willing to be checked, and to listen.”
Organizational Culture Tools for Responsible Innovation
Many companies make great efforts toward resolving cultural problems only to be left confused when their hard work doesn’t net results. According to Patton, this is often because they aren’t solving the right problem. By applying Patton’s qualitative analysis approach, which he’s already brought to companies including TikTok, Spotify and Microsoft, organizations can identify the different emotions people are experiencing, and the events that have triggered them. This provides a contextual understanding of people’s experiences, allowing the organization to create listening processes for understanding their employees, which can be turned into educational modules for a lasting and scalable effect on culture.
As AI tools continue to advance, Patton’s methods for responsible innovation – innovation that helps, rather than harms, communities – are becoming increasingly important. As the world grows in diversity, he helps leaders design products with diverse audiences in mind, understand how to use AI in ethical ways, and evaluate existing algorithms for biases and potential risks.
“Social work allows us to have a framework for how we can ask questions to begin processes for building ethical technical systems,” Patton says. “We need hyper-inclusive involvement of all community members — disrupting who gets to be at the table, who’s being educated and how they’re being educated, if we’re actually going to fight bias.”
Dr. Desmond Upton Patton is the Brian and Randi Schwartz University Professor and the thirty-first Penn Integrates Knowledge University Professor at the University of Pennsylvania. A leading pioneer in the field of making AI empathetic, culturally sensitive and less biased, he is the founder of the SAFElab, a social worker with the Department of Psychiatry and Behavioral Sciences at Children’s Hospital of Philadelphia, and the former co-director of the Justice, Equity and Technology Lab at Columbia School of Social Work. At Columbia University, he also served as Associate Director of Diversity, Equity, and Inclusion and co-chair of the Racial Equity Task Force at the Data Science Institute, and founded the SIM|ED tech incubator. His research uses virtual reality to educate youth and policymakers about the ways social media can be used against them and how race plays a part.
Professor Patton’s early work attempting to detect trauma and preempt violence on social media led to his current roles as an expert on language analysis and bias in AI and a member of Twitter’s Academic Research advisory board. As a social worker, Patton created the Contextual Analysis of Social Media (CASM) approach to center and privilege culture, context and inclusion in machine learning and computer vision analysis.
In 2018, Patton’s groundbreaking finding, which uncovered grief as a pathway to aggressive communication on Twitter, was cited in an amici curiae brief submitted to the United States Supreme Court in Elonis v. United States, which examined the interpretation of threats on social media. Widely referenced across disciplines, Patton’s research at the intersections of social media, AI, empathy and race has been mentioned in The New York Times, Nature, The Washington Post, NPR, Vice News, ABC News and other prestigious media outlets more than seventy times.
Professor Patton was appointed a Faculty Associate at the Berkman Klein Center for Internet & Society at Harvard University, where he was a 2017-2018 fellow. He won the 2018 Deborah K. Padgett Early Career Achievement Award from the Society for Social Work and Research (SSWR) for his work on social media, AI and well-being. Patton was also a 2019 Presidential Leadership Scholar and a Technology and Human Rights Fellow at the Carr Center for Human Rights at Harvard Kennedy School.
Before joining the faculty at Penn, Dr. Patton was the Senior Associate Dean for Innovation and Academic Affairs at Columbia and an assistant professor at the University of Michigan School of Social Work and School of Information. He holds a bachelor’s degree in anthropology and political science with honors from the University of North Carolina at Greensboro, a Master of Social Work from the University of Michigan School of Social Work, and a doctorate in Social Service Administration from the University of Chicago.
Desmond Patton is available to advise your organization via virtual and in-person consulting meetings, interactive workshops and customized keynotes through the exclusive representation of Stern Speakers & Advisors, a division of Stern Strategy Group®.
Developing Inclusive Cultures by Involving the Local Community
Many companies make great efforts toward resolving cultural problems only to be left confused when their hard work doesn’t net results. According to Desmond Upton Patton, the Penn Integrates Knowledge University Professor at the University of Pennsylvania, this is often because they aren’t solving the right problem. At the SAFElab, where Patton is the founding director, his team brings together social workers and local residents to add context to social media messages, which helps programmers build algorithms that interpret the messages correctly. By tapping a diverse group to incorporate their expertise into data used by technical systems, teams gain crucial knowledge that ensures the products they’re developing will be used and adopted. Patton’s process for rigorously identifying problems turns active listening into educational modules, ensuring all voices are fully heard and recognized. This partnership between technology and community reframes innovation to incorporate a fuller spectrum of humanity, creating an environment that’s more useful, equitable and joyful.
How Data Scientists and Social Scientists Can Partner to Combat Bias in a Digital World
From racial profiling through facial recognition software used by law enforcement to algorithms that unfairly target Black and brown users with subpar services and subprime financial practices, biases in the way technology is developed and used are rampant and must be intercepted before they do further damage. The Metaverse – a new combination of emerging technologies including artificial intelligence (AI), extended reality (XR), and blockchain – will create a virtual world for various aspects of human life, from social connections to e-commerce. But will the same algorithmic biases of our current technology carry over to a virtual world? Drawing from his extensive research and fieldwork, Dr. Desmond Upton Patton, Penn Integrates Knowledge University Professor at the University of Pennsylvania and founder of SAFElab, explains how data scientists, designers, and engineers can prevent bias in AI by collaborating with social science professionals who can help them become more aware of the unintended harms being built into technologies they develop. He also highlights the many ways organizations, developers, users, and society stand to benefit from such collaboration. This important presentation is geared toward organizations, professionals, and policymakers involved in the development and regulation of emerging technologies.
Ethically Engaging Artificial Intelligence
How can an organization be sure the artificial intelligence products and tools they create are helping communities, not harming them? Break your tech teams out of restricting narratives, says Desmond Upton Patton, the Penn Integrates Knowledge University Professor at the University of Pennsylvania and the former co-director of the Justice, Equity and Technology Lab at Columbia School of Social Work. The most cited and recognized scholar studying how groups constructed online can influence behavior offline, Patton has shown through his research the importance of collaborating with local populations when developing products and scalable solutions, incorporating their perspectives and insights to avoid unintended consequences. He grounds developer training programs in practical recommendations for steps like naming, active listening and processing, ensuring that perspectives can be accurately incorporated. He then suggests computer scientists spend time with anthropologists, political scientists and other community members to learn about their experiences and to ensure their voices make an impact. This effort saves organizations from lost time and money spent on products and services that don’t get adopted because they fail to serve customer needs. By bringing tech experts closer to the user experience with Patton’s scalable advice, organizations will find their products better serve their consumer community.
Building a Healthier Social Media Ecosystem
How would the world look if social media encouraged healthy relationships? Desmond Upton Patton, the Penn Integrates Knowledge University Professor at the University of Pennsylvania, as well as the founding director of the SAFElab, took an interest in social media when he realized how much online animosity drove violence in disadvantaged communities. Now he is a leading expert on contextualizing social media conversations and designing algorithms that detect trauma, pain and the types of exchanges that can potentially lead to conflict. As a key advisor for Kai.ai, which develops artificial intelligence companions that help with mental health, Patton shows technical teams how to incorporate his research in a way that increases well-being and joy when creating new platforms. Leaders and teams will learn how to be more empathetic in the digital space, promoting healthier online activity among employees, customers or entire communities.
Culturally Responsive Tools for Managing Stress
The world presents new stressful experiences every day – just turn on the news. Desmond Upton Patton – founder and director of SAFElab – is a pioneer in the field of making AI empathetic, culturally sensitive and less biased, and he’s helping organizations create and deploy culturally responsive tools for managing burnout and anxiety. In his work with well-being platform Kai.ai, he’s identified that people are better able to manage their stress if they have access to a diverse array of culturally responsive tools that incorporate unique experiences. By centering diverse voices, Patton says, we can create an environment where inclusion is both the starting point and the goal. Practical and straightforward, Patton explains why people must be proactive about making others feel included, then unveils tools for developing empathy and sensitivity. This helps organizations foster a culture of collaboration, innovation and inclusion no matter what’s happening in the world.
Fighting Bias in FinTech
Fighting bias is important everywhere, but it’s especially critical for financial institutions to combat discrimination at every opportunity. Desmond Upton Patton, the Penn Integrates Knowledge University Professor at the University of Pennsylvania and the former co-director of the Justice, Equity and Technology Lab at Columbia School of Social Work, is an expert in bringing communities and companies together to build impactful and mutually beneficial programs. Having noticed that many organizations create tools based on market research that doesn’t accurately represent populations, Patton brings leaders and teams practical advice and recommendations for partnering with communities that are currently underrepresented. His recommendations lay out a true partnership between institutions and the people who engage with them, showing leaders and teams how to have conversations with communities of color before designing solutions. By advising organizations to pause and reflect on their actions, Patton guides financial firms on how to engage in trustworthy practices that resonate, earning them loyalty from customers and employees alike.
How Technology Developers and Social Scientists Can Work Together to Combat Bias in the Metaverse
From racial profiling through facial recognition software used by law enforcement to algorithms that unfairly target Black and brown users with subpar services and subprime financial practices, biases in the way technology is developed and used are rampant and must be intercepted before they do further damage. The Metaverse – a new combination of emerging technologies including artificial intelligence (AI), extended reality (XR), and blockchain – will create a virtual world for various aspects of human life, from social connections to e-commerce. But will the same algorithmic biases of our current technology carry over to a virtual world? Drawing from his extensive research and fieldwork, Dr. Desmond Upton Patton, Penn Integrates Knowledge University Professor at the University of Pennsylvania, founder of SAFElab and co-director of Columbia’s Justice, Equity and Technology Lab, explains how data scientists, designers, and engineers can prevent bias in AI by collaborating with social science professionals who can help them become more aware of the unintended harms being built into technologies they develop. He also highlights the many ways organizations, developers, users and society stand to benefit from such collaboration. This important presentation is geared toward organizations, professionals and policymakers involved in the development and regulation of emerging technologies, including AI, AR, VR and XR.
Going Back: Loss, Empathy, and Inclusion in the New World of Work
Working from home during the pandemic presented employees, educators and students with unique challenges. Going back in some form will present others. Whether people return to an office or school full-time, part-time or remain fully remote, people will be operating in a new world of work, one colored by a universal sense of loss and longing for life before lockdown. In this talk, Dr. Desmond Upton Patton – founder of SAFElab, co-director of Columbia’s Justice, Equity and Technology Lab and a pioneer in the field of making AI empathetic, culturally sensitive and less biased – teaches participants how to identify and respond to signs of grief and stress in a colleague, whether they are operating in a digital or in-person workspace. He will also teach tools for developing empathy and sensitivity and explain why people must be proactive about making others feel included. His work goes a long way toward helping organizations foster a culture of collaboration, innovation and inclusion no matter where people are working, teaching or learning.
Technology and Ethics: How to Consciously Build Your Brand
Everything is smart these days. Smart apps. Smart cars. Smartphones. Smart products. While it all seems genius, most video game, app and technology developers are unaware of the biases they build into programs. Representations of characters, actions and settings often stem from preconceived notions of “reality,” putting creators in the dubious position of unknowingly sending the wrong message to today’s users, many of whom will carry those impressions into the future.
Developers have a powerful opportunity to create positive and lasting change by “creating more consciously” and Desmond Upton Patton – founding director of SAFElab, co-director of Columbia University’s Justice, Equity and Technology Lab, and Penn Integrates Knowledge University Professor at the University of Pennsylvania – would like to show them how, and why it’s important. Patton is a sought-after speaker, teacher, advisor and advocate to private and public sector organizations looking for guidance on how to ethically and consciously build brands, causes or policies that better serve society at every level. Having worked with companies like Microsoft, Spotify, Twitter and Facebook, Patton explains the biases currently programmed into the algorithms of devices people use every day. He shines a light on their flaws and outlines a framework for consciously designing bias-free technologies, some of which can potentially solve big world problems. He also discusses the ethics behind contact tracing and why it should not be our only tool for identifying virus-infected citizens.
How to Foster Diversity and Inclusion
The benefits of diversity and inclusion are commonly touted by organizations that then go on to pay them little attention. Too often, organizations believe they have fulfilled the requirements of diversity by hiring a certain number of minorities and women but fail to create the underlying cultural conditions that allow those employees to feel comfortable and thrive. In this presentation, Dr. Desmond Upton Patton draws on his background in social work and designing empathetic, culturally sensitive algorithms to analyze social media content to help organizations develop a better approach to diversity and inclusion. He reveals how language and data taken out of context can fail to anticipate conflict – both online and in the workplace – and how algorithms and datasets can be developed that address these issues. Crucially, Patton urges organizations to see diversity not as a self-sustaining condition but as something that needs to be continually nurtured through dialogue and the input of people from different perspectives and from differing backgrounds.
Bias in AI – The Next Battle for Equity
Artificial intelligence is, well, intelligent. But that does not mean AI is fair. AI algorithms commonly make biased decisions adversely affecting women and people of color, on matters ranging from credit allocation to prison sentencing. This is because AI relies on datasets that unconsciously teach machines to replicate injustice, fail to capture cultural nuance or context, and are designed primarily by white men, who often lack the ability to anticipate negative impacts on others. In this presentation, Dr. Desmond Upton Patton draws on his own experiences challenging AI bias to show how organizations can help defeat this emerging social problem. As AI becomes more crucial to decision-making across industries, those who care about equity, fairness and justice will have to take notice of how bias is being unconsciously promoted by technology. On this front, Patton is a trailblazer, expert and foremost authority.
Building a Healthier Social Media Ecosystem
Social media – once promoted as a global unifier – has become a driving force in dividing people and spreading hatred. Online vitriol has reached new heights, and now actively contributes to – rather than simply reflecting – intense political, cultural, racial and religious divisions throughout the world. Dr. Desmond Upton Patton, a social worker, took an interest in social media when he realized how much online animosity drove violence in disadvantaged communities. Now he is a leading expert on contextualizing social media conversations and designing algorithms that detect trauma, pain and the types of exchanges that can potentially lead to conflict. In this presentation, Patton shows how he has successfully developed ways to identify unhealthy social media discourse and use, and counsel individuals to be more empathetic. These lessons can be used by organizations of all types to promote healthier online activity, whether among employees, customers or entire communities.
A Practical Approach for Addressing Bias in Artificial Intelligence
November 19, 2022
Gun Violence in America: Violence is Contagious
June 2, 2021
Grow Your Business in 2021 By Leveraging Diverse Thinking
January 6, 2021
Ethics and Artificial Intelligence (Audio)
December 16, 2020
Why Chicago's Mayor Should Reconsider Social Media Marketing
August 17, 2020
Social Work Thinking for UX and AI Design
July 18, 2020
5 Questions on Data and Context with Desmond Patton
February 21, 2020
Desmond Upton Patton: "Contextual Analysis of Social Media" (Audio)
February 21, 2020
Desmond Patton on Social Media and Gang Violence (Audio)
December 10, 2019
NYU Conference on Race and Technology (Video)
May 28, 2019
Why AI Needs Social Workers and "Non-Tech" Folks
March 24, 2019
Internet Banging with Dr. Desmond Patton (Audio)
August 30, 2015
Social Work in the Metaverse: Addressing Tech Policy Gaps for Racial and Mental Health Equity
(Internet Policy Review, February 2022)
Contextual Analysis of Social Media: The Promise and Challenge of Eliciting Context in Social Media Posts with Natural Language Processing
(Association for Computing Machinery, February 2020)
Addressing the Inappropriate Use of Force
(International Society for Research on Aggression, April 2021)
Expressions of Loss Predict Aggressive Comments on Twitter Among Gang-involved Youth in Chicago
(npj Digital Medicine, March 2018)
Stop and Frisk Online: Theorizing Everyday Racism in Digital Policing in the Use of Social Media for Identification of Criminal Conduct and Associations
(Social Media + Society, September 2017)
Parenting in a Digital Age: A Review of Parents' Role In Preventing Adolescent Cyberbullying
(Aggression and Violent Behavior, June 2017)
"Police Took My Homie I Dedicate My Life 2 His Revenge": Twitter Tensions Between Gang-Involved Youth And Police In Chicago
(Journal of Human Behavior in the Social Environment, January 2016)
Internet Banging: New Trends in Social Media, Gang Violence, Masculinity and Hip Hop
(Computers in Human Behavior, January 2013)
As the University of Pennsylvania’s Brian and Randi Schwartz University Professor (the highest professorship at Penn) and the Penn Integrates Knowledge University Professor, Desmond Upton Patton is a leading pioneer fusing social work, communications and data science. At the SAFElab, where Patton is the founding director, he helps organizations create processes that connect employees, clients and customers so that they can learn how their products and tools affect people more broadly. The most cited and recognized scholar studying how groups constructed online can influence behavior offline, Patton has revealed through his research how to collaborate with local populations when developing products and scalable solutions, incorporating their perspectives and insights to avoid unintended harms.
Patton’s consulting and executive education services focus on the intersection of tech, well-being and communication with a special concentration in how tech interacts with and affects communities of color. Organizations that have benefited from Patton’s expertise in this area include TikTok, Spotify, Microsoft, Kai.ai, Lifebrand and AXON.
“Dr. Desmond Patton’s participation in our company’s team building event was immensely impactful. In response to the events of 2020, Vinli held a panel discussion workshop on the theme of ‘Identity, Expression and Bias’. Dr. Patton’s expertise and thoughtful commentary were instrumental in driving our exploration, helping us to reveal and question many assumptions on how we interpret interactions with people, events, and data in general. And as a software company in the big data space, we found that Dr. Patton did a phenomenal job of creating real connections between his social work research and themes Vinli staff encounter in our own work. For those in search of insightful, enlightening discourse experiences for your organization, he is an excellent choice of partner. Dr. Patton was a pleasure to work with, and we highly recommend him for your next event.”
“Dr. Desmond Upton Patton was the keynote speaker for the Jane Addams College of Social Work 10th Annual Training Institute for School Social Work Professionals. His presentation “Impact of Social Media on Youth” was innovative and inspiring. He explored the intersection of youth, gang violence and social media, as well as the use of cutting-edge technology to understand the ways youth express themselves on social media. His research in this area is both pioneering and proactive, and can shape the future of mental health practice in schools. He is a presenter that should be included on every school district’s in-service training agenda. You will not be disappointed with his outstanding presentation style, and the amazing content that he shares.”
“For us at Facebook, Prof. Patton is a valued partner and subject matter expert. An important challenge in our work of content review is understanding local context. We want to enforce our policies – to decide what content to leave on the site, and what content to remove – based on the best possible understanding of circumstances ‘on the ground’. By sharing his insights on youth in Chicago, Prof. Patton has provided us with a useful laboratory for thinking about how to interpret local patterns of speech. We hope to work with him to apply these lessons in other areas.”
“The University of Illinois in Urbana-Champaign invited Dr. Patton to share his work as part of our inaugural seminar series called, 'Harnessing Technology for Social Good.' Dr. Patton agreed to be a part of the seminar right away. His talk was the last scheduled seminar for the year and one of the most highly anticipated talks given the topic. Without question, Dr. Patton delivered. His talk blew the audience away. I was particularly struck by how his presentation captivated both the minds and the hearts of people in the audience, which in my mind was no easy task because he was speaking to academic faculty and 8th graders from the Southside of Chicago. Dr. Patton did a terrific job of engaging everyone in the room. At the end of the talk, Dr. Patton met with the students 1:1 and talked about his career trajectory and pathway to academia, again captivating young people in ways that were inspiring. I would highly recommend Dr. Patton as a speaker.”
“The Hunter College (CUNY) partner site of the Summer Institutes in Computational Social Science (SICSS) recently invited Dr. Patton to serve as a guest speaker at our location. The audience was mainly comprised of doctoral students from across the social and health sciences, from over 8 top-ranked universities across the country, and we discussed the myriad issues involved in using social media data for social science inquiry. Dr. Patton’s real-world examples of the complexity of social media posts, particularly from marginalized communities, and his compelling and clear descriptions of best analytic practices were instrumental in shaping our participants’ understanding of ethical approaches to the analysis of digital trace data. Our audience was just being introduced to these data and methods, and Dr. Patton’s clear, concise, and plain language facilitated their engagement with these topics in a concrete fashion. Dr. Patton was highly responsive and exceeded our requirements in every regard. In short, Dr. Patton is a tremendous public scholar, and it was our pleasure to work with him. I would highly recommend Dr. Patton as a speaker.”