Algorithmic Bias: Who’s Training the Machines?

If you’ve never heard the term “algorithmic bias” before, you’re not alone. Despite many years in pursuit of machine learning and AI capability, the idea that human biases influence the algorithms we create is one we’ve only recently acknowledged.

So, what is it really?

Algorithmic bias refers to the biases an algorithm picks up from its human creators, similar to how a child does from its parents. As artificial intelligence and machine learning have jumped out of science fiction into the real world, the potential negative impacts of biases in our systems have grown larger and more imposing.

But before we even begin discussing such a topic, we have to first break through a key myth most people have about computers and their programs: that they are objective. In reality, algorithms are as susceptible to influence as humans are because we’re the ones building them from the ground up in the first place. They don’t exist as entities in their own right, but as extensions of the human mind—just with more processing power.

That all sounds like a no-brainer when broken down that way, but there’s more. Here, we’re going to look at how machine bias may come about and the very real effects it has on the lives of people living today.

A quick look at cognitive bias

Cognitive bias

Definition: (noun) systematic patterns of deviation from norm and/or rationality in judgement.

Before we get into machine bias, let’s look at bias from a perspective a little closer to home—the biases that exist in our own minds. Cognitive bias is a recognised and highly studied area of modern psychology, touching on everything from pattern recognition to the influence of racism.

Basically, the human mind evolved to recognise patterns. It’s just what we do. The problem is that we are taught to irrationally see certain patterns over and above others. Eventually, we tend to ignore contradictory evidence, passing over blind spots in favour of sitting in our comfortable biases. After all, it’s human nature to enjoy a cosy comfort zone.

Some examples of cognitive biases include:

  • Bandwagon effect: the tendency to do or believe things because many other people do or believe them.
  • Confirmation bias: the tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions, ignoring competing evidence.
  • Dunning-Kruger effect: the tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.
  • Hindsight bias: the tendency to believe, after an event has occurred, that you knew the outcome all along.

Keep in mind that most cognitive biases are almost entirely unconscious, like many of our psychological processes. They fly under the radar, which is why they are so hard to detect, and why they slip so easily into the algorithms we build.


How we raise an algorithm, and what we feed it

HowStuffWorks explained it like this:

“To make a computer do anything, you have to write a computer program. To write a computer program, you have to tell the computer, step by step, exactly what you want it to do. The computer then ‘executes’ the program, following each step mechanically, to accomplish the end goal. When you are telling the computer what to do, you also get to choose how it’s going to do it. That’s where computer algorithms come in. The algorithm is the basic technique used to get the job done.”

Some algorithms simply carry out the exact set of instructions we’ve given them, while others perform what is called “machine learning”, using data to teach themselves new patterns and procedures.
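The difference can be sketched in a few lines of Python. This is a deliberately trivial, hypothetical example (the task and data are invented): a hand-written rule behaves the same for every input, while a “learned” rule inherits whatever gaps its training examples contain.

```python
def is_even_fixed(n):
    # Fixed-instruction algorithm: the rule is written by hand
    # and behaves identically for every input.
    return n % 2 == 0

def learn_even_from_examples(examples):
    # "Learning" version: induce a rule from (number, label) pairs by
    # memorising which final digits appeared among the positive examples.
    even_digits = {n % 10 for n, label in examples if label}
    return lambda n: n % 10 in even_digits

# Training data with a gap: no even number ending in 8 was included.
examples = [(2, True), (4, True), (16, True), (7, False), (9, False)]
is_even_learned = learn_even_from_examples(examples)

print(is_even_fixed(36), is_even_learned(36))  # ends in a digit seen in training: both agree
print(is_even_fixed(18), is_even_learned(18))  # the gap in the data becomes an error
```

The hand-written rule classifies 18 correctly; the learned rule gets it wrong, not because the code is buggy, but because the data never showed it that case. That is the mechanism by which biased or incomplete data becomes a biased model.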

In the modern world, machine learning is being utilised in every industry, from law enforcement to social media. While the automation of certain processes can make it easier for people to dedicate their energies elsewhere, the presence of unavoidable bias in these algorithms means they can end up having some severe impacts on the people the systems oversee.

Where does algorithmic bias come from?

Algorithmic bias comes from the pool of data the machine trains on, or from the people creating the algorithm (usually unconsciously). Creators often have trouble seeing an issue with a program if that issue doesn’t directly affect their own demographic. They don’t fix the error, widen the pool of data, or diversify their approach because they don’t see the issue in the first place.

That’s why it’s so important to have diversity.

“The Coded Gaze”: When Biased Algorithms Make Real Impacts

Something as ephemeral as bias can be hard to pinpoint, especially in a realm into which only a few dare to venture. However, those who do move in this space have started to raise their voices about the very real impacts biased algorithms have on people. As it turns out, these biases primarily manifest around race, gender, and class, three of the primary inequalities still plaguing our world.

Joy Buolamwini, an American computer scientist and self-proclaimed “poet of code”, speaks about machine bias using the term “the coded gaze”. In her words:

“Algorithms, like viruses, can spread bias on a massive scale at a rapid pace.”

Having worked on multiple projects that required her to use generic facial recognition software, one of the obstacles she encountered was the software’s inability to recognise her face due to her dark skin. When this kind of discrimination manifests in a light-hearted project like a Peekaboo Robot, it may not feel as significant to us.

But, as Buolamwini pointed out in her talk on algorithmic bias, this is not the full extent of these issues.

Real-life scenarios impacted by algorithmic bias

What happens when racially biased algorithms are used to select jurors? The jury becomes unbalanced, as these algorithms tend to favour people with white faces thanks to the imbalance in their training data. Therefore, a non-white defendant could face a jury with very little shared experience (and the ever-present threat of structural racism), thus facing the possibility of being judged unfairly.

What happens when a national healthcare algorithm used by insurers to determine the amount of money allocated to an applicant is found to be racially biased? In one very significant US case, this very thing happened: Black patients who were sicker were continually underestimated and underfunded. In fact, researchers postulated that fixing the problem “would increase the percentage of Black patients receiving additional help from 17.7 to 46.5%”—that’s more than double!

What happens when medical apps tell women their heart-attack-like symptoms are most likely a sign of depression? This is just one of the many examples cited by Carmen Niethammer, a Gender Diversity Leader and Private Sector Development Expert. There is a long history of women being under-represented in medical research samples, leading to a significant skewing of the results. Artificial intelligence programs built on these statistics and data are invariably going to end up biased against women.

These are just some of the ways in which the coded gaze can negatively impact someone’s life just because the world hasn’t yet learned to account for their existence. The time has come, at this intersection of rights battles and technological saturation, to ask ourselves what role technology really plays in our society.

Even bigger, it’s time to ask ourselves how we can divert the world from this problematic course of digital evolution.

Stay up to date with the latest and greatest technology updates with Pure SEO.

Keen on more thought leadership pieces surrounding the digital world? Sign up to our newsletter to receive updates straight to your inbox or start reading the rest of our blog now!

Courtney-Dale Nel

Courtney is a Content Writer on the Pure SEO team. They have a Bachelor in Behavioural Psychology, way too much experience working with pigeons, and a fondness for nachos that rivals most marriages.
