When I tell people I work in digital wellness, they usually chuckle and say something like, “Yeah, I know. Tech companies have too much power and do shady stuff. But, I’m addicted. What’re ya goin do?”
Big Brother may be controlling mysterious levers far, far away in Silicon Valley, but most of us have few complaints. Sure, at times our technology can be intrusive or distracting, but for the most part people tell me it’s equitable, helpful, and good.
And that’s an example of white privilege.
Technology feels equitable, helpful, and good for white folks because it was made for white folks and programmed by white folks (more specifically, white men) who work at companies run by white folks (again, white men). As a consequence, it often does not reflect the experiences, needs, or concerns of BIPOC or other marginalized communities.
A history of exclusion
For BIPOC audiences in particular, Big Brother is invariably pulling racist levers, collecting and manipulating data that has historically been used to stereotype, marginalize, segregate, discriminate, and cause outright harm—from voter suppression to redlining in housing to disparities in health care to… oh man, that list is endless. (If you don’t know about the history of racism in America, I highly recommend you do the work and start Googling.)
So it is no surprise that these same practices have been subtly baked into our technology, too.
I’m not saying any one piece of technology is inherently racist. Nor am I saying all tech is overtly designed to function in a racist way.
Nothing pops up when you log into Facebook that says, “No [fill-in-the-blank] allowed!” (That’s an overly-simplistic view of what racism–or sexism, or Islamophobia, or homophobia, etc.–looks like.)
I’m saying that a small cross section of largely white men–who have inherent biases and a very human tendency to see the small group they’re in as “normal” and anyone who falls outside it as “other”–have had (and continue to have) a HUGE amount of control and power over the technology used by most of the population. That’s not a good thing.
Some thorny questions
Technology that’s convenient for some may actually be harmful for others. For example…
- Can you see how trillion-dollar companies knowing where you live and work, who you have relationships with, and how you spend your days might feel dangerous if you’re Black, queer, poor, senior, disabled, or a woman, especially when those same tech companies have already been sued for showing bias?
- Can you see the possible drawbacks of a society that supports luxury surveillance—where “people who believe they have nothing to hide willingly submit to surveillance, pay more for it, and put themselves into a special, highly privileged category of person” –a category to which some groups decidedly will never belong?
- When there’s proof that some tech companies have turned a blind eye to hate speech and misinformation, do you see how threatening that might feel for people who have historically felt the brunt of hate directed toward their communities, their religion, their sexual orientation, their gender, and their bodies?
- Can you see how the proliferation of devices that collect intimate biometric information about their wearers and their activities (and upload that data to servers, communities, and repositories) might be scary for people who have historically been the subject of racial profiling, medical experimentation, bias in healthcare, or stalking/domestic violence?
So, what’re ya goin do?
Sure, it’s lovely that Facebook will serve up pop-up ads for cribs and diapers to pregnant women. Just like it was probably lovely that doors were once held open for well-to-do white ladies. But it’s important to recognize that that experience is certainly not everyone’s experience.
There IS racial discrimination embedded in much of today’s technology, from coded bias in facial recognition software, to the digital redlining of Internet access, to racial profiling with Ring cameras and neighborhood apps, to using AI for predictive policing, to discriminatory AI proctoring in learning software.
There are many people writing specifically about these issues more artfully than I. As someone working in digital wellness, I just wanted to acknowledge and amplify that work, because it’s important to talk about.
It’s time to stop laughing off the “shady stuff” in technology. Just because you think it doesn’t hurt you doesn’t mean it’s not your problem. Gifts that come at someone else’s expense should be everyone’s problem.