MRI facial recognition threatens the anonymity of participants in medical studies according to the Mayo Clinic

Thousands of people around the world have undergone a medical test (from MRIs to genetic analyses, for example) as part of a clinical study. Those who participate in this kind of research do so because they trust that their privacy is safe: no data is supposed to be kept that would allow the participating subjects to be identified.

Unfortunately, the line that has so far defined what kind of data can be considered ‘safe’ in that sense may have been rendered obsolete by advances in big data and artificial intelligence.

That, at least, is what a team of researchers from the Mayo Clinic says. They have just published in the ‘New England Journal of Medicine’ the results of an experiment showing that it is possible to reconstruct a person’s face from the data collected in an MRI of the head, and then to identify that reconstruction with Azure-based facial recognition software. Successfully.

The researchers recruited 84 volunteers, ages 34 to 89, who had recently undergone an MRI as part of a clinical study. The volunteers were photographed from five different angles, and an attempt was made to reconstruct their faces from the available data: magnetic resonance imaging captures features such as the contour of the skin, intramuscular fat, and the bone marrow of the skull, but not other useful ones, such as bone or hair.
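To make the reconstruction step concrete, here is a minimal sketch in Python of the general technique: thresholding a head MRI volume to isolate the skin/air boundary and extracting the facial surface as a 3D mesh with marching cubes. This is an illustration under stated assumptions, not the Mayo Clinic team’s actual code; the filename `head_mri.nii.gz` and the threshold value are hypothetical.

```python
import nibabel as nib
import numpy as np
from skimage import measure

# Load a head MRI stored as a NIfTI volume (hypothetical filename).
volume = nib.load("head_mri.nii.gz").get_fdata()

# Voxels brighter than this cutoff are treated as tissue; the 60th
# percentile is an assumption and would need tuning per scanner/sequence.
skin_threshold = np.percentile(volume, 60)

# Marching cubes extracts the triangulated skin surface, i.e. the "face".
verts, faces, normals, _ = measure.marching_cubes(volume, level=skin_threshold)

# A rendered image of this mesh could then be submitted to an off-the-shelf
# face-matching service, as the researchers did with Azure-based software.
print(f"Reconstructed surface: {len(verts)} vertices, {len(faces)} triangles")
```

The point of the sketch is that nothing exotic is required: the skin surface, which is exactly what a face photo captures, falls out of standard medical-imaging tooling.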

In 70 of the 84 cases, the algorithm matched the reconstructed face directly to its owner’s photographs, and in another 10 the software included the correct face among its top five candidates. In total, the technology failed to identify only 4 of the 84 faces, a success rate of roughly 95% when top-five matches are counted.

Where is the problem?

The great problem with this unforeseen route to ‘de-anonymizing’ medical data is that it opens the door to their use for commercial or, worse, criminal purposes (such as blackmail). Right now, the only protection for the privacy of clinical study subjects who undergo head MRIs rests on the fact that researchers accessing study data agree not to try to identify participants.

As Eliot Siegel, professor of radiology at the University of Maryland School of Medicine, explained to the WSJ: “The risk to the average patient is [now] very small… but as time goes on, the risk will increase, and it is very important to keep this in mind as we continue to create ever-larger datasets intended to power machine learning.”