Institutionalized bias is the concept that people — and organizations — unintentionally support negative stereotypes through their behavior. It isn’t a new topic in society, or in technology, but it continues to exist all around us. An instance of this bias in Splunk was recently brought to our attention, and I want to let you know what we’re doing about it.
Last week, a Splunk partner raised an issue with the use of the term “master/slave” in Splunk’s indexer clustering and licensing functionality. They let us know through a post on Splunk Ideas, a feedback portal where Splunk users share product ideas and the community votes on them. This post received upvotes faster than any I can remember, for good reason. We value open feedback from the Splunk community and respect them for holding us accountable and giving us the chance to do better. We’re addressing this issue as fast as we can, and will replace the terms with ones that are neutral, inoffensive and, in fact, more descriptive of the meaning.
I studied computer science, and I can tell you that software engineers and technologists learn these terms in their first year of college. They are ingrained in the way we talk about technology, in domains such as distributed systems and network security. We can — and should — find better terms that remove unconscious bias from our work. Just because these terms have been part of our industry vernacular for so long doesn’t make them any less offensive.
We’ve replied to the thread on ideas.splunk.com regarding “master/slave” and embraced this moment as an opportunity to improve. We will continue to update that post as we resolve the issue. Beyond removing those terms, here are the other changes we’re making:
- We had already begun the process of removing “whitelist” and “blacklist” and replacing them with appropriate terms. We’re actively discussing internally what those terms will be. It won’t happen overnight, but we are working as fast as we can.
- We’ve brought together a working group of people from across the organization to develop recommendations, guidelines and procedures for identifying and replacing biased language and for preventing new instances in the future. This team will also meet on a recurring basis to find further opportunities to increase our efforts in this area, and to help us determine the most effective ways we can drive and support change throughout our industry.
- We will use Splunk's public-facing channels to share what we learn, the decisions we make and the changes we implement.
I want to thank Splunk partners, the SplunkTrust and my fellow Splunkers for reminding us that we must take ownership of this issue and be the change we want to see in the world. Taken as a whole, everything I’ve mentioned represents a small step in the right direction. There are still people in the technology industry — including some Splunkers — who see terms like master/slave as neutral and descriptive. They question whether it’s our responsibility as a software company to take a stand on this issue.
Let me be very clear about two things:
- Black lives matter.
- It is absolutely our responsibility to take a stand on this issue.
Institutionalized bias has no place in our products, our documentation, our language or our actions. As technology leaders, we must be aware of these issues and fix them. Splunk isn’t a media company. We aren’t politicians. But we reach thousands of people every day in many different ways. The status quo is no longer okay. Our words and actions have consequences. We are committed to listening whenever someone tells us we’re being offensive or insensitive. And we’re committed to using our words and actions to help promote equity wherever we can.