Is technology inheriting our biases?

Developments in artificial intelligence (AI) and immersive experiences are paving the way for exciting possibilities in healthcare, business, entertainment and education. But are these technologies inheriting our negative and institutional biases?

To explore that question, AT&T and Ericsson co-hosted a private screening of the documentary bias before a packed crowd at Landmark’s Aquarius Theater in Palo Alto. After the film, Alka Roy, product and technology leader at the AT&T Foundry, moderated a panel about the implications of unconscious bias and how technology can either perpetuate or limit our negative biases. Panelists included Robin Hauser, director and producer of bias; Jerry Kang; Ronald Tyler of Stanford University; and James Zou of Stanford University.

“This is one of the biggest collective challenges of our times.”

Alka Roy, AT&T Foundry

“This is one of the biggest collective challenges of our times,” Roy said, explaining the impetus behind the event. “How we think about fairness, privacy and inclusion when building products and intelligent systems using AI affects our lives and the world we build for our future. We can’t afford not to have this conversation.”

The event kicked off with the screening of the documentary bias, by award-winning Director and Producer Robin Hauser. The film follows Hauser on a journey to uncover her hidden biases, revealing how unconscious bias defines relationships, work settings, our justice system and even technology.

What can we do about bias?

The film examines what we can do about negative biases in the workplace, in the products and tools being built with AI, and ultimately in our own brains. As the panel revealed, the solution is not cut and dried, and it perhaps needs to start with the obvious question: Are we motivated to remove bias?

“We know diverse teams have higher profitability and do better.”

Robin Hauser, Director and Producer

“We tend to do what is easiest and quickest, and that is where ‘like me’ bias comes into play,” said Hauser. “It’s easiest to hire someone just like us, because that is who we feel most comfortable with. We know diverse teams have higher profitability and do better. But how do you explain that to startups that are already highly profitable?”

“Diversity actually will increase the universe of possibilities, solutions and ideas considered. But it often creates a sort of friction, so it slows things down,” Jerry Kang added. “The way the answer is created, and the number of solutions is expanded – that’s what’s improved.”

Tools are being built to try to remove bias in hiring and in some aspects of the criminal justice system. But we’re learning that these tools need algorithmic transparency and a deeper understanding of the problem to work well. That’s easier said than done.

“We live in a society that suggests certain societal systems are colorblind. That is just not the case. The very notion of justice is a blindfolded woman holding scales. In criminal justice, bias matters a great deal,” said Ronald Tyler, commenting on the predictive tools used in courts that have come under scrutiny.

What is the role of technology?

James Zou, who recently published a paper about algorithms and bias, found that while technology can help us face our biases, the readily available data used to build many AI systems and models already carries bias. Zou’s research found that many gender, ethnic and cultural biases and stereotypes already exist in the English language. In other words, the intelligent systems themselves aren’t the problem, but rather the data used to build them.

“A big part of our work is to identify biases in the algorithms you’re interacting with daily.”

James Zou, Stanford University

“A big part of our work is to identify biases in the algorithms you’re interacting with daily and come up with ways to de-bias these algorithms,” Zou said. But then the question is – who decides what to de-bias, and how? Different cultures, groups and companies have their own interests and objectives.
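To make that idea concrete, here is a minimal Python sketch, not Zou’s actual code, of how this kind of de-biasing is often illustrated: measure how strongly a word aligns with a “gender direction” in word embeddings, then project that component out. The embeddings input (a word-to-vector dictionary, for example loaded from pre-trained GloVe or word2vec vectors) is an assumption for the example.

import numpy as np

def gender_direction(embeddings):
    # A crude bias axis: the difference between two gendered anchor words.
    return embeddings["she"] - embeddings["he"]

def bias_score(word, embeddings):
    # Cosine similarity between a word's vector and the gender axis.
    v = embeddings[word]
    g = gender_direction(embeddings)
    return float(np.dot(v, g) / (np.linalg.norm(v) * np.linalg.norm(g)))

def debias(word, embeddings):
    # Neutralize: remove the word's component along the gender axis.
    v = embeddings[word]
    g = gender_direction(embeddings)
    g = g / np.linalg.norm(g)
    return v - np.dot(v, g) * g

In sketches like this, occupation words such as “nurse” or “engineer” often show clearly nonzero bias scores before neutralization and near-zero scores afterward, which is what makes the stereotype visible in the first place.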

Zou has one possible approach. He and his team have been exploring the biases of people in the U.S. over the past hundred years, applying different algorithms to large archives of digitized American text from that period.

“We could reconstruct what were the biases for gender and for ethnic groups for the last 100 years,” he explained. “That’s one example where, if we use technology properly, it can really become a powerful mirror to study our own biases.”
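As an illustration of that “mirror” idea, and not a description of Zou’s actual pipeline, the sketch below assumes one embedding model per decade (a hypothetical models mapping, say, 1950 to a word-to-vector dictionary trained on that decade’s text) and tracks how a word’s association with gendered terms shifts across decades.

import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def gender_association(word, emb):
    # Positive values lean "female", negative lean "male" in this toy metric.
    return cosine(emb[word], emb["she"]) - cosine(emb[word], emb["he"])

def bias_over_time(word, models):
    # The resulting series is the "mirror": how each decade's language
    # associated the word with gender.
    return {decade: gender_association(word, emb) for decade, emb in models.items()}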

The documentary also highlighted how technology can be used to try to change our initial response to unconscious bias and possibly retrain our brains. For example, virtual simulations are being built to help retrain law enforcement officers by reframing the brain’s initial, bias-driven response.

Another approach may have nothing to do with technology at all. Tyler takes his Stanford law students to San Quentin to interact with prisoners as relatable human beings. Over the long term, this simple practice may transform how future judges make decisions.

“How do I put myself in touch with people who I might otherwise have these negative biases for?”

Ron Tyler, Stanford University

“Think about: How do I get out of my normal community and into the broader community,” advised Tyler. “And how do I put myself in touch with people who I might otherwise have these negative biases for?”

The reality is that AI developments will continue, and technology will enable business solutions we haven’t yet imagined. “We need to partner and create an open dialogue about how we build a better framework for AI and immersive solutions that are not only good for business but also good for people,” Roy said. “While documentaries like bias can help start the conversation, it’s up to the collective us to be deliberate and ask the hard questions and make decisions about the kind of world we want to build.”
