Biases. Preformed opinions. The conditioned response to feel a certain way about a specific stimulus. We all have them. We’ve been training our brains since birth to help us make sense of the world we live in. Many of them are innocuous. Some are not.
“I don’t like tuna fish sandwiches.”
“I like barbecued hamburgers.”
“I don’t like asparagus.”
All food-related opinions formed over the years based on our cultural programming. The things you were subjected to as a child, influenced by those around you. Maybe you don’t like tuna fish sandwiches because they smelled funny in your lunch bag and one of your classmates made fun of you for it. Maybe you like barbecued hamburgers because the smell of charcoal reminds you of pleasant times growing up. Maybe you don’t like asparagus because you noticed that your pee smells funny after eating it.
Some of you right now will read that last line and think, “some people have a gene that allows them to smell a compound contained in asparagus.” Others might think, “that’s absurd. I just don’t like asparagus.” Still others might think, “maybe some people just produce smelly urine.”
Still others might say the ability is an adaptation for detecting plant-based neurotoxins.
The science is currently not entirely conclusive on the subject, but chances are, you’re already biased in one way or another towards asparagus. And probably hamburgers. And likely tuna (fish) sandwiches.
Our biases become more complex – more nuanced – when we start reading the news. The things we share on social media become a record of our internal biases. They are a picture of part of the neural network we’ve formed over the years, from our earliest memories, mapped out in a series of articles and stories taken from around the web. You might suppose that by going through the things a person shares online, you could generate an approximation of that person’s public persona – a limited version of a person’s external projection of themselves (to paraphrase The Matrix).
It begins at an early age. Walking around the house asking our parents “why?” forms the early pathways in our brains and gives them the structure they need to bootstrap more complex thoughts and opinions. After years and years of training, we reach a point where we see a headline on a news article and think, “yes, this is true,” or “no, this is wrong.” Sometimes we do this without even reading the article. Social media conditions sharing in people. Retweet this, reshare that. Often we base our decision to share on a headline that may bear no resemblance to the content of the story. Or on a picture we find appealing.
This is bizarre behavior when you think about it. It’s a dopamine response to getting “likes” or “favorites”.
I have a personal example of a case where my own biases got me into trouble a couple of weeks ago. I was reading through one of the aggregators in my daily routine and saw a headline about a supposed backdoor security hole in the popular online messaging app WhatsApp. Aha! Thought I. They were acquired by Facebook. Of course they’re up to no good. Also, it was in The Guardian, a reasonably respectable, left-leaning news organization. (Speaking of biases…)
I reshared the thing on Twitter and on Facebook. It didn’t take long before some Facebook people called me on it, telling me this had already been debunked by security experts and wasn’t true. Fake news! went up the cry. Or something like that. I felt a little silly about it, honestly.
Why did I share that article? Why did I think it might be true? Where does this preconception that Facebook is a nefarious, Orwellian monitor of thought come from?
I suppose it goes back to reading Orwell and Huxley and Bradbury and Dick during my formative years. I have a deep distrust of anything designed to catalog and identify individuals. It’s not paranoia if there is someone actually watching you.
Well, that’s weird, you might be thinking. You’re probably right. That is your bias.
If I were to try to break it down more concretely – this feeling that Facebook is somehow sinister – I’d have to go back to the early aughts, when I was working for Canada’s Border Services Agency. Just after 9/11, I saw a memo about a DARPA RFP to create a system that could track and store everything a person did on the internet. I’m not making this up. It was worded just like that. I was working in the intelligence unit on some Big Brother-type stuff at the time. We all laughed when we saw it. “That’s crazy,” we said. This was probably in 2002.
It was around that same time that we saw a demo of a facial recognition system being developed by an Israeli company, I think. It was crude. They talked about fingerprinting facial keys and showed a demo of a system that could take a surveillance image, compare it to a primed database full of mug shots, and score a hit with about 60% accuracy. The face had to be pretty clear in the surveillance image, at the right angle, not obscured by glasses or hoods or hats. And it was slow. Running on a dedicated box with all the correctly massaged and cultivated data, it still took an uncomfortable number of seconds to score a hit. We calculated that scouring a database the size of a national intelligence collection for a particular face would take a good amount of time. Like… probably minutes.
That was in 2002.
When I first saw Facebook in 2006 or 2007, I was horrified. Literally. I think my jaw fell through my ass. “You mean, people are going to put this stuff online… willingly?” “These buttons people are putting on these websites do what now?” Then I found out that Peter Thiel was an original investor, later learned about Palantir, and now he’s an advisor to… well, you know all that.
So yeah. Biases. We all have them. They don’t always make sense. Be careful what you share online. Read the article first. Consider it. Think about it. And if you feel strongly about what you’re reading, take a step back for a minute and ask yourself why you feel that way.
Share this on Facebook! Subscribe to my mailing list. I promise I won’t track you!
Have a nice day.