Facebook’s A.I. model SEER was designed to exclude Instagram images from the EU in its dataset, likely to avoid GDPR violations — from OneZero’s General Intelligence, a roundup of A.I. news.
The company scraped 1 billion images from Instagram to train its artificial intelligence, but did not include images from European users in its training data. There is a good chance the team intentionally excluded Instagram images from the European Union in order to comply with GDPR.
Researchers at Facebook announced a breakthrough yesterday: a “self-supervised” algorithm trained on one billion Instagram images learned to recognize objects accurately without relying on human labeling.
Typically, building an accurate image recognition algorithm requires humans to label images as containing dogs, horses, people, or other subjects; the algorithm can then detect similarities between images that humans have marked as alike. Yann LeCun, head of Facebook’s artificial intelligence research, has been on a decades-long mission to end A.I.’s reliance on labels, a goal often called the “holy grail” of artificial intelligence.
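To make the label-free idea concrete, here is a minimal sketch of the contrastive-style objective common in self-supervised vision research (the specific loss and augmentations here are illustrative assumptions, not the exact method Facebook used): the model is trained to pull two augmented views of the same image together and push views of other images apart, so no human labels are ever involved.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two feature vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def contrastive_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss: the two views of the SAME image act as the
    'correct class'; other images act as distractors. No labels needed."""
    sims = [cosine_sim(anchor, positive)] + [cosine_sim(anchor, n) for n in negatives]
    logits = np.array(sims) / temperature
    # softmax cross-entropy where the positive pair is index 0
    return -logits[0] + np.log(np.sum(np.exp(logits)))

rng = np.random.default_rng(0)
img = rng.normal(size=8)                  # stand-in for an image embedding
view1 = img + 0.01 * rng.normal(size=8)   # two lightly augmented views
view2 = img + 0.01 * rng.normal(size=8)
others = [rng.normal(size=8) for _ in range(4)]  # embeddings of other images

loss = contrastive_loss(view1, view2, others)
```

Because the supervision signal comes from the images themselves (matching augmented views), the approach scales to a billion unlabeled photos in a way label-dependent training cannot.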
To train the algorithm, Facebook didn’t simply grab any billion Instagram images from the web. The team deliberately excluded images from the European Union, noting in its paper that the training set consisted of “random, public, and non-EU images.” While the rest of the world’s Instagram images were fair game, EU residents needn’t worry about their photos powering Facebook’s next big algorithm.
OneZero asked Facebook whether the exclusion was due to the EU’s GDPR, which gives users greater insight into how companies use their data and protects against data use without their permission. A Facebook spokesperson acknowledged the question but did not immediately answer it.
Whether the exclusion was because using the data would have violated GDPR, or simply because Facebook didn’t want the appearance of impropriety, the law likely had a chilling effect on how private data was used.
In a message to OneZero, Jules Polonetsky, CEO of the Future of Privacy Forum, said it’s not uncommon for companies to err on the side of caution when collecting data in the European Union.
Global companies often limit how they use data covered by GDPR in order to meet the law’s requirements, he wrote, noting that explicit consent is often needed before sensitive data can be used.