Try removing an unwanted tattoo. Many dermatologists will tell you that removal depends on colour, depth, and length of treatment. In some cases, you can expect scarring or discolouration.
Whether or not you have a tattoo, the information you post online about yourself behaves like one. Removing online data takes a similarly complex process – and success is not guaranteed.
|(CC) Gavin Llewellyn www.onetoomanymornings.co.uk|
This analogy was drawn by the author of a reply to a blog post by Vibhu Norby on The Next Web, ‘Why social media is endangering our future, and what we can do about it’. It’s a great analogy: we act on the spur of the moment, attracted by something that can have a long-term impact – yet all too often, we fail to think twice.
The computer scientist describes his fear:
…‘that the gigantic brain of the Internet will be used to abuse people who have the least power – those who have had their lives shared in public and can’t get it back. I fear that companies and governments will make dangerous assumptions about you with your public data. I fear that health insurance companies will automatically raise my rates because I let my friends know I was traveling to a malarial area. I fear that auto insurance companies will raise the rates of people who say they are “doin’ 105 on 105” without realizing it’s a lyric from Kendrick Lamar. I fear that companies will reject people from jobs by algorithmically turning down candidates with lower perceived reading levels. And there are already plenty of known examples of governments data-mining to discover political dissidents.’
He also suggests three solutions to fix our ‘algorithmically determined future’:
- Better privacy settings and a default setting set to ‘private’ rather than ‘public’;
- Social media networks and other online ‘archives’ need to provide easier, effective, and more straightforward methods for users who want to remove, hide or curate their online data;
- Companies who own these networks should create algorithms that can help users think twice about the content they are about to share.
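The third solution – an algorithm that nudges users to think twice – could take many forms. The blog post does not describe an implementation, so the following is purely a hypothetical sketch: a pre-share check that flags categories of potentially sensitive content (the pattern rules, echoing the examples from the quote above, are my own assumptions, not the author’s).

```python
import re

# Hypothetical "think twice" rules -- illustrative assumptions only,
# not part of the original blog post.
SENSITIVE_PATTERNS = {
    "location": re.compile(r"\b(travelling|traveling|flying|heading)\s+to\b", re.IGNORECASE),
    "health": re.compile(r"\b(malaria|diagnosis|prescription)\b", re.IGNORECASE),
    "driving": re.compile(r"\bdoin'?\s+\d{2,3}\b|\b\d{2,3}\s?mph\b", re.IGNORECASE),
}

def think_twice(post: str) -> list[str]:
    """Return the categories of potentially sensitive content in a post,
    so the sharing UI can warn the user before publishing."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(post)]
```

A real system would need far subtler signals (context, song lyrics, irony), which is exactly why false positives like the Kendrick Lamar example are hard to avoid.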
Despite the excellent ideas put forward by the author, there is a fourth factor which cannot be overemphasised: the user’s responsibility.
Although the default setting is often set to ‘public’ – which I agree should change – no one is preventing us from going through the settings before posting that photo. No one is coercing us to post that comment which we will later regret. There is no excuse for not reading the fine print that pops up right before we agree to something. It may be fine, but it is not invisible. Fine print is too long to read? That’s no excuse, either. The first thing I learned in law school? ‘Ignorance of the law is no excuse’.
At the same time, literacy and awareness are not the same thing, even though many companies that own social networks conveniently assume they are. Hence the author’s three solutions above, and his plea to all Internet users:
‘If you are reading this right now, you are somebody that can make a difference. Set your privacy settings on all of your apps and show your friends and family how to do the same. Demand better privacy policies from companies that store your data. Before sharing something, think twice about whether you want that information to be searchable 20 years from now. I am hopeful that as an industry, together we can take back control of our future.’
Nobody wants to be ‘nameless, aimless’, as Dizzee Rascal puts it. Nobody wants to be remembered for the wrong reason, either.