iMBA Program

Welcome iMBA Students!

The iMBA team is available to assist you at all times. We recommend bookmarking this page, as it will be a useful reference as you complete the program. If you have any questions or issues, please contact us at i-support@illinois.edu.

iConverge

iConverge is our annual on-campus networking and professional development event. It is a chance for students to see familiar friends, make new ones, and develop professional relationships outside the classroom.

Learn More

Graduation

Gies College of Business grants degrees three times a year: in May, August, and December. You must submit an application for graduation during your final term to have your name placed on the degree list and receive your diploma.

Learn More




News and Events

Gies research reveals AI chatbots influence racial stereotypes

Aug 21, 2023, 08:24 by Aaron Bennett
In a recent study, participants who negotiated with Black avatars, as compared with Asian or white avatars, rated the chatbots highest in both competence and humanness.

Gies Business associate professor Tiffany White and a team of researchers recently conducted a study that showed consumers’ perceptions of chatbots differ from their racial biases toward people.

With the rise of artificial intelligence and the growing use of chatbots to communicate with customers, the team explored how consumers perceive chatbot interactions. Their paper, “I’m Only Human? The Role of Racial Stereotypes, Humanness, and Satisfaction in Transactions with Anthropomorphic Sales Bots,” appeared in the Journal of the Association for Consumer Research. They also created an animated video summing up their findings.

Participants in their study negotiated the price of a three-night stay in New York City with the goal of getting the best price possible. But they didn’t negotiate with humans. Instead, they negotiated with one of three chatbots, each representing a different race: Asian, Black, or white. The bots were identical apart from the appearance of their avatars.

The research team measured participants’ existing racial stereotypes before the simulated negotiation and then measured their reactions to the chatbots after the negotiation. Participants’ responses before the negotiation reflected what many human-to-human studies have shown in the past: people tend to stereotype Black people as less competent than Asian or white people.

But the negotiation experiment revealed the opposite once chatbots were involved. Participants who negotiated with Black avatars rated the chatbots highest in both competence and humanness, and they were more satisfied with the outcome of their negotiation than participants who interacted with bots represented by Asian or white avatars.

White and her team attributed this result to expectation violation theory, inferring that because participants weren’t used to seeing Black chatbot avatars, they found the interaction unusual and possibly felt that it was a more human experience. White said she believes these results can be a catalyst for more research.

“There’s a lot more opportunity to further explore this possibility,” she said. “The data are suggesting that the race of the bot matters in a way you wouldn’t necessarily expect. Where something curious is happening, that’s where science happens. That’s why science happens. And the idea of our research really is to spur more research in this area to understand the impact.”

White explained that firms often turn to chatbots for efficiency and that they’ve become increasingly common in recent years, prompting the need for a better understanding of their impact socially and in the marketplace.

“Industries are finding themselves understaffed,” White said. “The technology is relatively less expensive, and it’s scalable, so firms are using it. It’s the new normal. Interactions with technology are almost inevitable for consumers on a day-to-day basis.”

But with little research exploring human-to-AI interactions in a marketplace context, White sees an opportunity for future studies to help firms understand both the power of chatbots and the obstacles to using them. And on a broader scale, she views the results as an opportunity to further explore the possibilities of chatbots helping to change negative stereotypes.

“Firms get to make lots of decisions,” she said. “They get to make decisions about whether they use these types of interfaces, these types of interaction agents. They get to make choices about whether they give them faces. And correspondingly, they get to make choices about what those faces look like. And what this research shows is one implication of one strategic decision that a firm might make – that the race of the bot matters.”

White’s coauthors include Nicole Davis from the University of Georgia, Nils Olsen and Vanessa G. Perry from the George Washington University, and Marcus M. Stewart from Bentley University.