By Megan Whelan – RNZ Digital Editor
Back in the halcyon days of the internet, the idea was that this new thing called the world wide web would right many of the wrongs of history.
It would allow everyone to have a voice, there would be a platform for those who hadn’t ever had one, and everyone would somehow play nice in this new utopia.
“We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth,” John Perry Barlow of the Electronic Frontier Foundation wrote. “We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity.”
It’s a lovely idea, but ask any woman who’s shared an opinion - or a joke - if it has worked out that way. And even when we share things ourselves, all but a rare few of us are sharing a curated and filtered version of our lives. No one’s kale smoothie actually looks that good.
“I just want younger girls to know this isn’t candid life, or cool or inspirational. It’s contrived perfection made to get attention,” wrote model Essena O’Neill when she quit Instagram, saying social media isn’t real life. “We’re a generation told to consume and consume,” she continued.
All that data, all those algorithms we hear about all the time, are driving that consumption. It’s an internet truism now that “if you’re not paying for the service, you’re the product.”
In some ways, those algorithms are helpful. When you’ve been googling laptops, and you flick over to Facebook and the ad you see is for laptop bags, that might actually be filling a need you didn’t know you had. Netflix knowing you might like Lords and Ladles because you watched Nailed It means you never have to search for something to watch again. But often they seem like blunt tools - a department store advertising something you’ve already purchased.
But the algorithms aren’t just trying to show you something, sociologist Zeynep Tufekci argues. They’re trying to change you - whether it’s your shopping habits, your TV preferences, or your political views.
“As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.”
For people interested in an open society where informed debate thrives, this is a terrifying prospect. Two thirds of American adults get at least some of their news through social media. A lot of us are reliant on the big companies - who are adamant they’re not media companies, and thus not beholden to the same legal and ethical standards - for information about what’s going on in the world.
How can we have a debate if everything we’re seeing is through a different lens? If I’m only seeing positive brussels sprout posts, and you’re only seeing those in the negative, how can I convince you that brussels sprouts are excellent if you roast them with a little balsamic vinegar?
The internet where everyone was meant to have a voice has given way to one driven from the shadows by data that some would argue is “neutral”.
But data isn’t fair, argues data scientist Cathy O'Neil. “Algorithms don't make things fair if you just blithely, blindly apply algorithms. They repeat our past practices, our patterns. They automate the status quo. That would be great if we had a perfect world, but we don't.”
In Germany this year, Facebook has had to delete hundreds of hate speech comments. Twitter has been accused of an anti-conservative bias. A machine-learning system trained on photos concluded that a person pictured in a kitchen was more likely to be a woman. And when Microsoft released a chatbot to Twitter, it learned how to be racist in a day.
In a world where people are rightly concerned about fake news, where expertise is derided, where comments sections are full of awful abuse, and where debates continue over what freedom of speech means in 2018, it might be worth asking those companies for a little information.
It’s not words blacked out in a letter - though governments still do that - but it might be worth asking those services we use every day why we’re seeing what we’re seeing, and when. Who controls the flow of information, and what measures are used to exercise that control? If history teaches anything, it might just be to ask those questions before it’s too late.
Read more perspectives on censorship at WW100.govt.nz/censorship