The algorithms that power artificial intelligence (AI) tools are built on a foundation of human judgement calls – but humans sometimes make mistakes. How can organizations confidently take advantage of the benefits associated with modern technology while avoiding negative impacts on surrounding communities?
“Human experts and tech experts need to work together,” says Desmond Upton Patton, the University of Pennsylvania’s Brian and Randi Schwartz University Professor (the highest professorship at Penn) and Penn Integrates Knowledge University Professor, who holds joint appointments in the School of Social Policy & Practice, the Annenberg School for Communication, and the Department of Psychiatry in the Perelman School of Medicine. He was also recently elected to the National Academy of Medicine (NAM), one of the nation’s highest honors in the fields of health and medicine.
“Existing data science techniques cannot accurately understand key cultural nuances in language among predominantly communities of color. Our methodologies – which center and privilege culture, context and inclusion in machine learning and computer vision analysis – create non-biased and culturally nuanced algorithms to give tech companies a holistic perspective on various business and social issues. The companies that adopt these proactive measures are then able to ensure they are not unintentionally propagating bias.”
At the SAFElab, where Patton is the founding director, social workers and local residents join Patton and his team to add context to social media messages, helping programmers build algorithms that interpret those messages correctly. Community-based partnerships between social workers and technology developers reframe innovation to incorporate a fuller spectrum of humanity, creating a more useful, equitable and joyful environment. By nurturing these relationships, says Patton, who was recognized as a 2022 Top 50 in Digital Health (Equity Advocates) honoree, organizations see their investments of time and money pay off as their solutions are adopted because they genuinely address customer needs. Drawing on a diverse group’s expertise to shape the data behind technical systems yields crucial knowledge and insight, ensuring that the resulting output will be used and embraced. This approach earned him a place among the first cohort of 100 emerging changemakers selected to participate in the Obama Foundation’s Leaders USA program.
A pioneer in fusing social work, communications and data science, and the most cited and recognized scholar studying how groups constructed online can influence behavior offline, Patton helps organizations create processes that connect employees with customers, enabling their products to serve people more broadly. Through keynote presentations, interactive workshops and advisory roles with AI companies Kai.ai and Lifebrand, Patton helps organizations develop a better approach to diversity and inclusion, one built on fairer practices that address prejudice rather than contribute to it.
How to Expand Your Product’s Audience
Historically, breakthrough technologies like AI or augmented and virtual reality have been wielded exclusively by data scientists and software engineers, but Patton says it’s time for that to change. He encourages computer and data scientists to move beyond a reflection-based mindset to drive true inclusion. Pointing out that many developers create products based on non-representative market research that excludes the viewpoints of people of color, people with disabilities, LGBTQIA+ people and other populations, he recommends teams adopt reflexive thinking strategies that ask critical questions about the context of their data. His practical recommendations – adding steps like naming, active listening and processing to developer training, then reaching out directly to underrepresented groups to incorporate their points of view – ensure those voices make an impact.
“Being able to talk to people like anthropologists, political scientists, community members and computer scientists matters for getting outside of restricting narratives,” stresses Patton, a former Fellow at the Harvard Kennedy School’s Carr Center for Human Rights Policy. “You cannot, and should not, be an ethical engineer if you have not gone through a process in which you have to deal with your impact on the things you’re developing. If we can make that a requirement, then I think that we will slowly get to a space where people can at least be active in these conversations, willing to be checked, and to listen.”
Organizational Culture Tools for Responsible Innovation
Many companies make great efforts toward resolving cultural problems only to be left confused when their hard work doesn’t net results. According to Patton, this is often because they aren’t solving the right problem. By applying Patton’s qualitative analysis approach, which he’s already brought to companies including TikTok, Spotify and Microsoft, organizations can identify the different emotions people are experiencing and the events that have triggered them. This provides a contextual understanding of people’s experiences, allowing the organization to create listening processes for understanding their employees, which can be turned into educational modules for a lasting and scalable effect on culture.
As AI tools continue to advance, Patton’s methods for responsible innovation – innovation that helps rather than harms communities – are becoming increasingly important. As the world grows in diversity, he helps leaders design products with diverse audiences in mind, understand how to use AI in ethical ways, and evaluate existing algorithms for biases and potential risks.
“Social work allows us to have a framework for how we can ask questions to begin processes for building ethical technical systems,” Patton says. “We need hyper-inclusive involvement of all community members — disrupting who gets to be at the table, who’s being educated and how they’re being educated, if we’re actually going to fight bias.”
Dr. Desmond Upton Patton is the Brian and Randi Schwartz University Professor and the thirty-first Penn Integrates Knowledge University Professor at the University of Pennsylvania. A leading pioneer in making AI empathetic, culturally sensitive and less biased, he is the founder of the SAFElab and a social worker with the Department of Psychiatry and Behavioral Sciences at Children’s Hospital of Philadelphia. He was previously the co-director of the Justice, Equity and Technology lab at the Columbia School of Social Work, as well as the Associate Director of Diversity, Equity, and Inclusion, a co-chair of the Racial Equity Task Force at The Data Science Institute, and the founder of the SIM|ED tech incubator at Columbia University. His research uses virtual reality to educate youth and policymakers about the ways social media can be used against them and the part race plays.
Professor Patton’s early work attempting to detect trauma and preempt violence on social media led to his current roles as an expert on language analysis and bias in AI and a member of Twitter’s Academic Research advisory board. As a social worker, Patton created the Contextual Analysis of Social Media (CASM) approach to center and privilege culture, context and inclusion in machine learning and computer vision analysis.
Patton’s groundbreaking finding, which uncovered grief as a pathway to aggressive communication on Twitter, was cited in an amicus curiae brief submitted to the United States Supreme Court in Elonis v. United States, a case examining the interpretation of threats on social media. Widely referenced across disciplines, Patton’s research at the intersections of social media, AI, empathy and race has been mentioned in The New York Times, Nature, The Washington Post, NPR, Vice News, ABC News and other prestigious media outlets more than seventy times.
Professor Patton was appointed Faculty Associate at the Berkman Klein Center for Internet & Society at Harvard University, where he was named a 2017-2018 fellow. He won the 2018 Deborah K. Padgett Early Career Achievement Award from the Society for Social Work and Research (SSWR) for his work on social media, AI and well-being. Patton was a 2019 Presidential Leadership Scholar and a Technology and Human Rights Fellow at the Carr Center for Human Rights Policy at Harvard Kennedy School.
Before joining the faculty at Penn, Dr. Patton was the Senior Associate Dean for Innovation and Academic Affairs at Columbia and an assistant professor at the University of Michigan School of Social Work and School of Information. He holds a bachelor’s degree in anthropology and political science with honors from the University of North Carolina at Greensboro, a Master of Social Work from the University of Michigan School of Social Work, and a doctorate in Social Service Administration from the University of Chicago.
Desmond Patton is available to advise your organization via virtual and in-person consulting meetings, interactive workshops and customized keynotes through the exclusive representation of Stern Speakers & Advisors, a division of Stern Strategy Group®.
As the University of Pennsylvania’s Brian and Randi Schwartz University Professor (the highest professorship at Penn) and the Penn Integrates Knowledge University Professor, Desmond Upton Patton is a leading pioneer fusing social work, communications and data science. At the SAFElab, where Patton is the founding director, he helps organizations create processes that connect employees, clients and customers so that they can learn how their products and tools affect people more broadly. As the most cited and recognized scholar studying how groups constructed online can influence behavior offline, his research has revealed how to collaborate with local populations when developing products and scalable solutions, incorporating their perspectives and insights to avoid unintended harms.
Patton’s consulting and executive education services focus on the intersection of tech, well-being and communication with a special concentration in how tech interacts with and affects communities of color. Organizations that have benefitted from Patton’s expertise in this area include TikTok, Spotify, Microsoft, Kai.ai, Lifebrand and AXON.