New Texas partnership aims to define ethics in artificial intelligence

Systemic biases and racism can seep into the systems that power AI.

By Michael Marks & Gabriella Ybarra | August 8, 2022 1:56 pm

It seems like every day there are new headlines about what artificial intelligence can do. However, a new research partnership at the University of Texas at Austin is more interested in how it works. The initiative between UT-Austin’s Good Systems program and the MITRE Corporation, a nonprofit focused on solving big social problems, will focus on ethics in AI research and innovation.

Kenneth R. Fleischmann, a professor in the School of Information at UT-Austin and the founding chair of Good Systems, spoke to Texas Standard about making sure AI technologies contribute to the common good.

This transcript has been edited lightly for clarity:

Texas Standard: Can you give us an overview of the kinds of questions that are going to drive the research at this new partnership?

Kenneth R. Fleischmann: So, in the work that we’re doing, we’re starting with a definitional question: What does it mean for AI-based systems to be good? And of course, it’s not good in absolute terms, but good for whom and in what circumstances. It’s really important to understand the context in which AI-based systems will actually be deployed, then how we would evaluate the goodness of those systems, again from a stakeholder perspective. And then finally, how we can build AI-based systems that are good, that align with the needs and values of society.

Can you give us some examples of how AI will affect the day-to-day lives of regular people in the future?

There are many cases where not only are our lives impacted by AI, but we may not even be aware of it. So, anytime someone applies to work at a big box store, that application may be reviewed automatically by an AI-based system before it’s reviewed by humans. It might actually result in the application being rejected by a system before a human ever sees it. Similarly, there have been some pretty prominent stories in the news of, for example, delivery workers being fired based on algorithms that automatically assess their job performance and flag it as inefficient, but don’t necessarily factor in real-world situations, such as how hard it is to deliver a package to an apartment complex if the complex’s main office hasn’t opened yet.

A further issue is that not only are these systems often inaccurate and based on an incomplete understanding of the world, they’re often also based on a biased understanding of the world. In many cases, these AI-based systems are trained on historical data, and that historical data comes from a systemically racist and sexist society. And thus, the algorithms themselves may embed that racism and sexism and other forms of inequity into the predictions that they make. This is a huge issue for society: if we’re not thoughtful and careful about how we address these inequities, we may be perpetuating existing inequities or creating new ones.
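
To make that mechanism concrete, here is a minimal, hypothetical sketch (not part of the interview) of how a model trained on biased historical hiring decisions can reproduce that bias. The groups, scores, and penalty value below are all illustrative assumptions chosen for the example.

```python
# Minimal synthetic sketch: a classifier trained on biased historical hiring
# decisions reproduces that bias for equally qualified candidates.
# All data, groups, and numbers here are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Qualification (skill) is drawn identically for both groups.
group = rng.integers(0, 2, size=n)   # 0 = group A, 1 = group B
skill = rng.normal(0, 1, size=n)

# Historical hiring: same skill requirement, but group B was penalized.
hired = (skill - 0.8 * group + rng.normal(0, 0.5, size=n)) > 0

# Train on the biased historical outcomes, with group as an input feature.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Score two equally qualified applicants (skill = 0), one from each group.
print("P(hire | skill=0, group A):", model.predict_proba([[0.0, 0]])[0, 1])
print("P(hire | skill=0, group B):", model.predict_proba([[0.0, 1]])[0, 1])
# The model assigns group B a lower hire probability at identical skill,
# simply because it learned that pattern from the historical labels.
```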

How do you get those principles and best practices that are coming out of this research into the hands of the people who are using these systems, like governments and the private sector?

In terms of the specific relationship with government, this has been a major focus of Good Systems. We’ve partnered with the World Economic Forum; we jointly hosted an event focusing on how governments can influence the design of AI-based systems through their purchasing decisions, like which AI-based systems to purchase or not purchase.

Definitely one of the hot-button items has been facial recognition systems. Some companies have made decisions to delay or cancel the deployment of these systems, and some governments have chosen to de-adopt facial recognition, given the biases that are embedded in many of those systems, as well as some of their problematic applications. So, those are some examples of how the purchasing power of government can have an impact and play a role.
