A recent report by Reuters suggests that Facebook has hired a team of contract workers in Hyderabad, India, whose job is to comb through millions of Facebook users’ photos, status updates and other content they have posted since 2014.
These contract workers “label” items into five different “dimensions,” or categories, and the resulting data is fed to Facebook’s AI systems to further enhance their capabilities. Manual labeling of content, also known as “data annotation,” is something companies seek to harness for AI training and other purposes.
However, the practice raises serious questions about the privacy of users, who are unaware that strangers are scrutinizing their personal data.
Indian IT firm Wipro is one of the outsourcing companies assigned the labeling task. It received a $4 million contract for the project last year, and about 260 labelers have been analyzing posts from the past five years.
Once that backlog was cleared, the team was cut down to 30 in December last year, and the remaining labelers began working through each month’s posts in the month that followed. The project is expected to last until the end of 2019.
According to employees at Wipro, the project allows them to look through “a window into lives as they view a vacation photo or a post memorializing a deceased family member.”
Wipro labelers and Facebook have confirmed that the analysis is being performed on all kinds of data such as text-based status updates, shared links, event posts, Stories feature uploads, videos and photos, including user-posted screenshots of chats on Facebook’s various messaging apps.
These posts come not only from Facebook but also from Instagram users globally, and span multiple languages including English, Hindi, and Arabic. Moreover, Facebook admits that some of these posts, including screenshots and comments, may contain usernames as well.
The worst part is that the labeling process covers private posts too, and Facebook users aren’t even offered a chance to opt out. The EU’s GDPR requires companies like Facebook to give users more transparency and control over their data, yet in this case Facebook’s data policy does not even mention manual analysis.
The social media giant, in its defense, claims that it has an auditing system in place “to ensure that privacy expectations are being followed and parameters in place are working as expected.” However, the important question remains unanswered: how can Facebook guarantee that such broad access to private data will not be misused?