“Bias is not inhuman. To have bias is absolutely human,” says Banu Raghuraman. If data enthusiasts are going to break the cycle of bias, however, “we have to be more aware of where the data is coming from, what it’s trying to tell us and how we can break the cycle [of bias] to build a better place for humanity.”

Raghuraman is a director of Business Process & Analysis at Info-Tech Research Group and focuses much of her research and writing on empathy. In the video below, she attributes the lack of empathy in the technology industry to people losing sight of the perspectives behind their data points. She cautions that if we are not more aware when gathering data, human biases embedded in AI could ultimately lead to a corrupt system – and she offers tips on how to become more aware.

The problem with data

Having bias, says Raghuraman, is just as natural as having emotions, and she draws a clear parallel between bias and anger. It's completely normal to feel angry, for example. However, when there is no logical basis for the anger, or when that feeling causes rash actions, problems begin.

As a society, it's crucial to evaluate when we are acting on biases and to eliminate them where possible. As with any other psychological problem, the first step is acknowledging that there is one.

Once we recognize our own biases, we can allow ourselves to take a step back and adjust our mindsets. And as copious amounts of data are constantly analyzed, we also need to check how the information was recorded and who recorded it. This allows us to check for, and eliminate, possible bias introduced by the recorder.

“We have a long way to go when it comes to institutions catching up to how far technology has evolved. And with data being easily accessible and all these powerful machines that we have today we can definitely look at bringing about a change, it all just is a matter of your mindset.”

One example of bias from her own life involves her name. Names that are hard to pronounce, like hers, contribute to lower acceptance rates for resumes. This is the case for both human reviewers and automated resume filters. The current system penalizes perfectly qualified people for something that reflects nothing of their skills. If the resume filtering problem continues, it will also impede diversity in the workplace and lead to slower progress.

The world has much room for improvement, and to change it, learning to identify biases is key. So become more aware of your biases and notice how they affect your perception of others, whether consciously or not. And remember Raghuraman’s words: “Change is ultimately in your mind.”

See more videos from Women in Analytics

Raghuraman’s video was part of a casting call sponsored by The SAS Women in Analytics (WIA) Network, a community advocating for more diversity in the data and analytics field.

The group focuses on the SAS values of authenticity, work/life balance and passion to inspire women who are pursuing a career in analytics. Learn more about WIA

To learn more about women who are elevating the field of analytics, visit WIA | Top Women in Analytics | SAS

About Author

Olivia Ojeda

Olivia Ojeda is a Marketing Intern on the Thought Leadership, Editorial, and Content team at SAS. She helps write and edit collateral and content. She has an Associate in Arts degree from Wake Technical Community College and plans to graduate with a degree in Business Administration/Marketing from North Carolina State in the next few years.
