Digital analytics tools have become a crucial part of most businesses’ toolbox, but wielding those tools successfully requires thoughtful leaders who can consider the ethics of each decision while always keeping the human element in mind.
That is the kind of leadership Professor Brent Kitchens strives to cultivate in his McIntire School of Commerce students. Kitchens teaches in the School’s M.S. in Commerce Program and is an Associate Director of its Center for Business Analytics. His research focuses on how business leaders can use big data analytics and data-driven decision making to grow and compete.
“Business analytics is a very broad field, ranging from something as simple as being able to summarize and visualize data coming in to put it in front of an executive, to artificial intelligence models that make a prediction and automatically take an action based on that prediction,” Kitchens says.
Better Decisions with Data
Kitchens and his colleagues in McIntire’s M.S. in Commerce Program give students an overview of what can be done with different digital analytics tools.
“While we go deeper in certain areas, our primary objective is to provide a broad view and make people conversant with the tools and technologies used to make decisions with data, including data management, databases, predictive models, and neural networks,” he says. “We also spend a lot of time learning through projects that take students through the entire process of transforming data from raw information into something that adds value.”
Many graduates, Kitchens notes, go into consulting, a field that requires them to work with a variety of clients and use data and data analytics to improve their clients’ business.
However, no matter what field they go into, Kitchens cares most deeply that his students understand the ethical and societal implications of their decisions. Digital analytics draws on vast quantities of data—consumer behavior online, past purchases, identity markers, text analytics, and much more—to deliver insights with an unprecedented level of detail. Such immense technological resources have a lot of power, and those deploying them have ethical responsibilities.
“There is a very broad set of possibilities, and those possibilities come with some ethical concerns. What is the right way to use all of this data? Is it ethical and beneficial to all stakeholders involved?” Kitchens says. “Those are the themes that I try to incorporate into all of my classes. I believe that the most important way I can get that kind of thoughtfulness out into the world is through my students who will become our next business leaders.”
The key, Kitchens says, is using analytics to inform and supplement human intuition and decision making, rather than replacing it.
“One thing I talk about a lot in my classes is a shift in decision making. Decisions in the past have been made largely on intuition, which is important,” he says. “Analytics does not replace intuition, but it should inform it. It allows leaders to augment their abilities to understand what is happening at a high level and either reinforce or challenge those assumptions.”
Marketing executives can easily speculate about what will catch a consumer’s attention, for example, and their assumptions are often accurate. But analytics gives them granular data about individual consumers to confirm those assumptions, prompt new ones, and customize their products.
Questioning the Algorithms
Much of Kitchens’ research focuses on what happens when that customization goes awry in media—ethically if not technically.
“I am really passionate about how social media platforms influence our news consumption and general information availability,” he says.
In 2020, Kitchens and fellow McIntire faculty members Steven L. Johnson and Peter Gray published a paper on how social platforms can create echo chambers, in which users hear only content they agree with and often miss news or perspectives that might challenge their assumptions. The paper received a lot of attention at the time, and two years into another election cycle, it remains very relevant to a polarized electorate.
Kitchens says that it doesn’t have to be this way. Different social media platforms have different impacts based on how they structure their algorithms. In fact, Kitchens, Johnson, and Gray found that Reddit, which is more topic-based, had a moderating effect on the types of news people consume. Facebook, which bases its newsfeed on likes and comments, had a polarizing effect, and Twitter, which was purely chronological at the time of the study, had little significant effect.
“It is not useful to talk about platforms as a monolithic thing, but rather to look at the nuance,” Kitchens says. “In general, if an algorithm cares about engagement, that is going to create different incentives and different outcomes than an algorithm prioritizing something else.”
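That contrast between ranking objectives can be made concrete with a small sketch. The code below is purely illustrative (it is not the method from the Kitchens, Johnson, and Gray study, and all post data is invented): it ranks the same three hypothetical posts two ways, by engagement and by recency, and the two feeds surface different items at the top.

```python
# Illustrative sketch only: how a feed ranked by engagement can surface
# different content than a chronological one. All posts are made up.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    timestamp: int  # larger = more recent
    likes: int
    comments: int

def engagement_rank(posts):
    # Prioritize likes + comments, as an engagement-driven feed might.
    return sorted(posts, key=lambda p: p.likes + p.comments, reverse=True)

def chronological_rank(posts):
    # Newest first, as Twitter's feed was at the time of the study.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

posts = [
    Post("Measured local news report", timestamp=3, likes=12, comments=4),
    Post("Outrage-bait hot take", timestamp=1, likes=250, comments=90),
    Post("Niche policy explainer", timestamp=2, likes=30, comments=8),
]

print(engagement_rank(posts)[0].title)     # the high-engagement post wins
print(chronological_rank(posts)[0].title)  # the most recent post wins
```

The point of the toy example is the incentive structure: under the engagement objective, the provocative post leads the feed regardless of when it was posted.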
These are exactly the types of problems that Kitchens wants his students to think through. What kind of algorithms will provide the most beneficial news environment to the highest number of people? What does a beneficial news environment actually look like? Can we create it?
To wrestle with those types of questions, Kitchens has his students review examples and articles and engage in group discussions. One week, for example, they talk about algorithms used by parole boards to predict recidivism—another offense by someone released from prison. Is there bias built into those algorithms? Are they actually fair to the prisoners whose lives are at stake, or to the civilians they aim to protect?
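One way students might probe such a tool for bias is to compare error rates across groups. The sketch below is a hypothetical illustration (the records are invented, and this is only one of several competing fairness measures): it computes the false positive rate—how often people who did not reoffend were nonetheless flagged as high risk—separately for two groups.

```python
# Hypothetical illustration: comparing false positive rates by group
# for a risk-scoring tool. The records below are invented.

records = [
    # (group, predicted_high_risk, actually_reoffended)
    ("A", True,  False), ("A", True,  True),  ("A", False, False),
    ("A", False, False), ("B", True,  False), ("B", True,  False),
    ("B", True,  True),  ("B", False, False),
]

def false_positive_rate(group):
    # Among people in this group who did NOT reoffend,
    # what fraction were flagged as high risk anyway?
    negatives = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

print(false_positive_rate("A"))  # group A's rate
print(false_positive_rate("B"))  # group B's rate
```

In this invented data, group B is wrongly flagged twice as often as group A—exactly the kind of disparity Kitchens asks students to look for before trusting an algorithm's output.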
“I try to have them step back. It is very easy to think, ‘Oh, this is just how things work,’” he says. “I want them to question their assumptions and look at how the consequences could play out, the impacts on different populations and on their organization.”
After all, Kitchens says, today’s students will be the ones using data to make tomorrow’s big decisions.
“They need to have an appreciation for what can happen when they use their knowledge responsibly.”