Ever is a free app, launched in 2013, that offers unlimited storage to back up your photos and videos, and then makes it easy to organize and share them from the cloud.
Officially, its revenue comes from users who subscribe to the premium version for extra features. But American media have revealed that those photos were also being monetized in a far more controversial way.
As revealed by NBC News, the images of Ever users were being used without their knowledge as training data for the company’s facial recognition technology, which it then sold to law enforcement agencies and private companies.
This was done through its Ever AI division, whose website advertises to potential customers a “private dataset of 13 billion photos and videos” drawn from tens of millions of users in 95 countries. A dataset “in continuous expansion”, which Doug Aley, CEO of Ever, has confirmed is fed with the data collected by his mobile app.
Aley has insisted that at no point have the images themselves been shared, nor any information that could identify users. But the truth is that it is precisely the facial information of those users (very detailed information, since most users keep hundreds or thousands of photos in the cloud) that allows Ever’s facial recognition AI to learn to recognize human facial patterns, and many users would never have uploaded their photos to Ever’s cloud had this been made clear to them.
And even now, the wording in its privacy policy remains somewhat vague:
“To enable you to organize your files and to allow you to share them with the right people, Ever uses facial recognition technologies as part of its service. Your files may be used to help improve and train our products and technologies. Some of these technologies may be used in our other products and services for enterprise customers, including our enterprise facial recognition offerings.”
“This seems to be a flagrant violation of people’s privacy,” says attorney Jacob Snow of the American Civil Liberties Union of Northern California. “They are taking photos of their users’ families, from a private app, and using them to develop surveillance technology. That is enormously worrying.”