- In a nutshell: In 2014, Cambridge Analytica needed lots of data about the public to power its political influencing technology, which used hyper-targeted ads to attempt to change people’s voting intentions.
- So, it used data from a personality test app created by a Cambridge academic that (due to the way Facebook worked back then) gave them not only detailed information about everyone who had taken the test, but data about their friends, too.
- This meant they had data on 50 million people who had never given permission to (and had likely never heard of) Cambridge Analytica. This data was then combined with other datasets, allowing the company to target ads at very niche groups.
- Facebook is keen to point out that its service was working as designed at the time and that it did nothing wrong. You can argue that designing a system that allowed user data to be passed around unchecked was wrong, and that doing nothing about this case for years, despite learning about it in 2015, was also wrong.
- Facebook threatened to sue the Guardian’s Sunday newspaper, the Observer, on Friday.
- What next? The public isn’t likely to care that much about this once the initial headlines are out of the way. It does increase the drumbeat of calls for greater regulation of big tech companies, though – especially in the USA. EU countries have strict new data protection rules coming into force in May this year.