© 2024 KLCC

OSU student heads research in developing anti-bias practices for AI

Oregon State University's College of Engineering. (Photo: OSU / Oregon State University)

A doctoral student at Oregon State University’s College of Engineering has taken a new approach to making Artificial Intelligence less systematically biased.

To train AI models, computer scientists use publicly available data from the web. To cut costs and shrink the amount of data a model must process, that information often goes through a process called deduplication, which removes redundant examples.

If a system is fed 100 photos, deduplication might filter that set down to just 50.
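The kind of filtering described above can be sketched in a few lines. This is a minimal illustration, not the method used in the research: it greedily keeps one example from each group of near-duplicates, judged by cosine similarity of feature vectors. Real pipelines compare learned image embeddings; the tiny hand-made vectors here are stand-ins so the example is self-contained.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def deduplicate(vectors, threshold=0.95):
    """Return indices to keep: each kept example is the first-seen
    member of its near-duplicate group."""
    kept = []
    for i, v in enumerate(vectors):
        if all(cosine(v, vectors[k]) < threshold for k in kept):
            kept.append(i)
    return kept

# Four "images": the first two are near-duplicates, so one is dropped.
data = [
    [1.0, 0.0, 0.0],
    [0.99, 0.05, 0.0],  # near-duplicate of the first
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
]
print(deduplicate(data))  # → [0, 2, 3]
```

Note that the rule "keep the first-seen member" is arbitrary, which is exactly where bias can creep in: whatever happens to come first in the data decides who stays in the training set.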

According to OSU doctoral student Eric Slyman, this procedure has led some AI models to filter out images along lines of race, age, and gender, skewing the data those models learn from.

This skew can produce inaccuracies: an AI asked to depict a doctor, for example, might generate only images of white men.

“If we’re basing the decisions that these systems are making based on precedent of our past decisions as a society, then we know our society hasn’t always done things in a way that’s fair,” Slyman told KLCC.

Along with researchers at Adobe, Slyman developed a technique to reduce bias at the stage of deduplication.

The new technique, called FairDeDup, aims to build fairness considerations into the data used to train AI models.

“Instead of looking at how we change the context in what an AI says, we’re changing the context of what an AI learns,” said Slyman.
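The general idea of changing "what an AI learns" at the pruning stage can be sketched as follows. To be clear, FairDeDup's actual algorithm is described in the published study and differs in its details; this is only a hedged illustration of the concept, assuming near-duplicate clusters have already been found and each example carries a single hypothetical sensitive-attribute field, `attr`. Instead of always keeping the first member of a cluster, it keeps the member whose attribute is currently least represented among the examples kept so far.

```python
from collections import Counter

def fair_prune(clusters):
    """For each near-duplicate cluster, keep the member whose sensitive
    attribute is least represented among the examples kept so far."""
    kept = []
    counts = Counter()
    for cluster in clusters:
        choice = min(cluster, key=lambda item: counts[item["attr"]])
        kept.append(choice)
        counts[choice["attr"]] += 1
    return kept

# Three clusters of near-duplicates with a toy attribute "A" or "B".
clusters = [
    [{"id": 0, "attr": "A"}, {"id": 1, "attr": "B"}],
    [{"id": 2, "attr": "A"}, {"id": 3, "attr": "B"}],
    [{"id": 4, "attr": "A"}],
]
print([item["id"] for item in fair_prune(clusters)])  # → [0, 3, 4]
```

A "keep the first member" rule would retain examples 0, 2, and 4, all with attribute "A"; the fairness-aware choice keeps a mix of "A" and "B" while pruning just as many examples.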

The technique, developed and tested with Stefan Lee, an assistant professor in the OSU College of Engineering, has gained recognition as an innovative way to reduce discrimination in AI systems.

Slyman said they hope their work will help mitigate biases while also offering profit-increasing incentives for tech companies.

The full study on the FairDeDup method is available online.

Cailan Menius-Rash is an intern reporting for KLCC as part of the Charles Snowden Program for Excellence in Journalism.