Facebook admitted on Wednesday that “most” of its 2.2 billion users “could have had their public profile scraped” by third parties without their knowledge, and that the personal information of up to 87 million people was improperly shared with Cambridge Analytica.
“In total, we believe the Facebook information of up to 87 million people — mostly in the US — may have been improperly shared with Cambridge Analytica,” said Mike Schroepfer, Facebook’s Chief Technology Officer.
Initial reports had put the number of users affected by the Cambridge Analytica data purchase at 50 million. The London-based political data firm bought the data from two psychologists (one of whom currently works for Facebook) who developed a data-harvesting app presented as a personality quiz.
One method used by “malicious actors” to “scrape” user data was to enter another person’s phone number or email address into a Facebook search, allowing that person’s profile information to be harvested. “We believe most people on Facebook could have had their public profile scraped in this way,” Schroepfer said.
The Wednesday admissions were accompanied by the announcement of nine major changes aimed at safeguarding user privacy following the data harvesting scandal that has pummeled Facebook’s stock and prompted Congressional inquiries. CEO Mark Zuckerberg will testify before the House Energy and Commerce Committee on April 11, a hearing that chairman Greg Walden (R-OR) and ranking member Frank Pallone Jr. (D-NJ) said would be “an important opportunity to shed light on critical consumer data privacy issues and help all Americans better understand what happens to their personal information online.”
In addition to eliminating the ability to search for users by email and phone number, Facebook will also ensure that it does not collect the content of messages sent via its Messenger app or Facebook Lite on Android.
The Menlo Park company admitted to Bloomberg on Wednesday that it’s been scanning private messages between individuals communicating through Messenger to “make sure it follows the company’s content rules.”
The company told Bloomberg that while Messenger conversations are private, Facebook scans them and uses the same tools to prevent abuse there that it does on the social network more generally. All content must abide by the same “community standards.” People can report posts or messages for violating those standards, which would prompt a review by the company’s “community operations” team. Automated tools can also do the work. –Bloomberg
“For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery or when you send a link, we scan it for malware or viruses,” a Facebook Messenger spokeswoman said in a statement. “Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform.”