One word could help Facebook users understand why so much is at stake when it comes to data privacy and leaked information. At a joint hearing of the Senate Judiciary and Commerce Committees in Washington, Mark Zuckerberg explained how the biggest social media company on the planet leaked reams of user data. The culprit? A simple personality test, delivered through a third-party plug-in, that lured in users like ants to a picnic.
First, on a personal level: as a registered user of a social media service like Facebook, I have the right to know what is happening to my data, and so do you and everybody else.
They’re called psychographics, and they definitely catch your eye. Sen. John Thune of South Dakota called them a “quiz” in his opening statement. At the heart of the issue is that a data trap completed by 300,000 users led to the misuse of data belonging to their friends and family on the social network. And, it must be said, the trap worked. Quite famously.
A psychographic is like a personality test. You’ve probably seen them many times in your Facebook feed, and maybe even completed a few.
They tap into a basic human need to be understood, to share our identity, to become known. When we answer a few simple questions to let our friends and family know which movie character we’re most like or which car we’d drive if we had our pick, we’re sharing a digital piece of our analog lives.
It’s why so many people participate in surveys, and why your feed is so filled with the results from all of the people you follow. It’s all about connection.
And it is also the root cause of how a company called Cambridge Analytica managed to collect the personal preferences of as many as 87 million Facebook users. We like sharing. Many of us fall for these personality tests because they tap into a basic human craving to reveal our desires and dreams. That craving is what built Facebook.
From a security standpoint, there’s a clear lesson here. Security and privacy go hand in hand, and Facebook will have to figure out how to balance the need for privacy and how their business model depends on access to as much data as possible.
The most important company in tech helps users connect with each other all over the world; now we know it is also helping itself to our data without protecting it. The business model will only work if Facebook can assure its user base that their data is safe.
That could take years. I’m still a Facebook supporter and user, mostly because I know this business model is better than the alternative (that is, not having the services at all). I also know there’s a lot at stake here. We’ve been running along fine for the past decade sharing data and assuming the companies that process that data know how to manage it. We’ve been wrong. These are multi-billion dollar companies, but the profit has come from using and abusing our data like it is a treasure chest they found on a sunken ship.
But these are real users. The data is not a treasure chest to exploit, it is something Big Tech needs to place as a priority above all else. We need to see change. Our data is more like a set of jewels, a rich collection of information that is a goldmine for advertisers but also contains the memories, personal details, and insights of the public.
Here’s the rub: Facebook is beholden to its advertisers. That much is clear. The ads you see in your feed pay the salary of every employee at Facebook. Yet the company needs to make a shift and ask: are the users more important? If you don’t protect the data, you are not protecting the users, and the advertisers will move on to the next app or service.
This is true of almost every Big Tech company, including Apple. When you treat user data like it is not that important, you suddenly become a data monster.
So what do we do about the word psychographic?
One important step will be to educate users. Who is really behind that data collection? Where does your data go after you click submit? What’s the paper trail? Facebook clearly doesn’t know. Users have the right to know exactly what happens. It’s not enough to learn that you are Jason Bourne or that you match best with Ned Flanders from The Simpsons. The data traps must end.
Your comments are highly appreciated. And remember, sharing is caring: let someone else know about psychographics.