Clearview Fined £7.5m For Illegally Storing Facial Images
The facial recognition company Clearview AI has been fined more than £7.5m by the British regulator and told to delete the data of British residents. The company collects images from the Internet for a global facial recognition database.
The UK's Information Commissioner's Office (ICO) says this breaches UK data protection laws, as Clearview AI has collected more than 20 billion images of people's faces worldwide to build an international online facial recognition database.
In November 2021 the ICO said: “The ICO has today announced its provisional intent to impose a potential fine of just over £17 million on Clearview AI Inc, a company that describes itself as the ‘World’s Largest Facial Network’... In addition, the ICO has issued a provisional notice to stop further processing of the personal data of people in the UK and to delete it following alleged serious breaches of the UK’s data protection laws.” At that time, British law enforcement bodies including Scotland Yard, the National Crime Agency, Northamptonshire Police, North Yorkshire Police, Suffolk Constabulary and Surrey Police were among the forces understood to have tried the technology.
Now, in May 2022 the ICO says, “The ICO has fined Clearview AI Inc £7,552,800 for using images of people in the UK, and elsewhere, that were collected from the web and social media to create a global online database that could be used for facial recognition.” Furthermore, the ICO has ordered the firm to stop obtaining and using the personal data of UK residents.
Clearview AI, which is based in New York, takes publicly posted pictures from Facebook, Instagram and other Internet sources, often without the knowledge of the people pictured or the platforms hosting the images, and without asking permission.
John Edwards, UK Information Commissioner, said: "The Clearview company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable...
People expect that their personal information will be respected, regardless of where in the world their data is being used."
Clearview no longer offers its services to UK organisations but, because the company has customers in other countries, it is still using the personal data of UK residents.
“Given the high number of UK internet and social media users, Clearview AI’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge,” the ICO said. Britain has become the fourth country to take enforcement action against the firm, following France, Italy and Australia. The ICO's action comes after a joint investigation with the Office of the Australian Information Commissioner.
Clearview's system allows a user to upload a photo of a face and find matches in a database of billions of images it has collected. It then provides links to where matching images appear online.
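At a high level, systems like this typically convert each face image into a numerical embedding and then look up the nearest stored embeddings to return candidate matches and the URLs where they were found. The sketch below is a minimal Python illustration of that general embedding-and-nearest-neighbour pattern; the names, embedding dimension and stand-in embedding function are assumptions for illustration only, not Clearview's actual code.

```python
import numpy as np

# Illustrative sketch only: a generic face-search pattern, not Clearview's code.
# Each face image is turned into a fixed-length unit vector ("embedding");
# a query face is matched by cosine similarity against all stored embeddings,
# and the URLs of the closest matches are returned.

EMBEDDING_DIM = 512  # assumed embedding size, typical for face-recognition models

def embed_face(image) -> np.ndarray:
    """Stand-in for a real face-embedding model; returns a random unit vector."""
    vec = np.random.rand(EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)

def search(query: np.ndarray, stored: np.ndarray, urls: list, top_k: int = 5):
    """Return the top_k (url, similarity) pairs for the query embedding."""
    scores = stored @ query                    # cosine similarity of unit vectors
    best = np.argsort(scores)[::-1][:top_k]    # indices of the highest-scoring faces
    return [(urls[i], float(scores[i])) for i in best]

# Example: index three hypothetical scraped images, then query with a new photo.
urls = ["https://example.com/a.jpg", "https://example.com/b.jpg", "https://example.com/c.jpg"]
stored = np.stack([embed_face(None) for _ in urls])
print(search(embed_face(None), stored, urls, top_k=2))
```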
The ICO found that Clearview AI Inc breached UK data protection laws by:
- Failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way.
- Asking for additional personal information, including photos, when members of the public ask whether they are on its database. This may have acted as a disincentive to individuals who wish to object to their data being collected and used.
- Failing to have a lawful reason for collecting people's information.
- Failing to have a process in place to stop the data being retained indefinitely.
- Failing to meet the higher data protection standards required for biometric data.
European countries including France, Italy, Greece and Austria have all condemned Clearview AI's method of extracting information from public websites, saying it violates privacy laws. In March 2020, Clearview AI was sued by the American Civil Liberties Union, which contended the company illegally stockpiled images of three billion people scraped from Internet sites without their knowledge or permission.
Prof. Fraser Sampson, the British Biometrics Commissioner, has recently warned police forces against deploying the technology to identify potential witnesses as well as suspects. Successive independent commissioners have warned that automatic facial recognition technology is even more privacy-invasive than the police collection of DNA and fingerprints. However, unlike those biometrics, the government has not put facial recognition images on a specific statutory footing which would ensure limits and oversight of how the authorities can use them.
The ICO fine is the third largest fine imposed in the UK since GDPR came into force in 2018, and the largest fine it has issued for a data privacy incident not caused by an external security breach. This looks like a statement of intent from regulators to clamp down on the inappropriate use of people’s personal data.
Paul German, CEO of security solutions firm Certes Networks, commented: “Organisations and CIOs / CISOs globally need to take note. They need to ensure that any potentially sensitive data they use, transmit and store is not only treated in line with the strictest data security principles, but also that they are clear and transparent around their data ethics."