
TrollBusters founder part of AI panel at WVU Hackathon

Nov 13, 2017

(Left) Dr. Michelle Ferrier talks about the ways in which algorithms can contain biases that have devastating consequences for health care access, financial solvency, personal security and other issues. (Right) The TrollBusters bot interacts with targets on the Facebook page for TrollBusters. / Photo by West Virginia University and screenshot provided


Women gather to tackle themes of diversity in artificial intelligence

ATHENS, Ohio (Nov. 13, 2017)—One team pitched a kiosk-based healthcare assistant that uses personal data and regional trending data to navigate health symptoms. Another looked at detecting bias toward Muslim communities in journalism coverage. Women at the West Virginia University Reed Media Innovation Center tackled gender, racial, and religious biases during a hackathon this past weekend focused on diversifying artificial intelligence.

Associate Professor Michelle Ferrier of the E.W. Scripps School of Journalism participated in the opening panel with industry guru Susan Etlinger of Altimeter Group; Flynn Campbell, a human rights attorney and founder of Malena; and moderator Erin Reilly of West Virginia University. The event was hosted by WVU and MediaShift on November 10-12 in Morgantown.

Ferrier said that when women and people of color are absent from design teams, their needs are often ignored. The goal of the event was to help women understand the opportunities and risks of artificial intelligence and to brainstorm more inclusive AI.

Ferrier, founder of TrollBusters, shared the recent launch of the TrollBusters chatbot, designed to provide targets of online abuse with direction on what to do next. TrollBusters uses natural language processing to track targets of online harassment, providing just-in-time education, services and coaching.

“The bot lives on Facebook and allows us to be more responsive with providing our tools to targets,” said Ferrier. “Research shows that getting support and feeling supported affects the way targets process and manage online abuse.”

Ferrier said AI is in all our intimate spaces. “And we need diverse design teams to help AI learn cultural, gender and other differences,” she said.

-From faculty reports