Researchers from New York University recently presented findings from a study on biometric data security and artificial intelligence. Using machine learning, they generated ‘fake’ fingerprints that the most recent scanning technology read as real in four out of five attempts. Recent concerns about having our private data stolen pale in comparison with this new level of insecurity surrounding fingerprint collection.
Machine learning steps up its game with the new DeepMasterPrints technology. This development, which can generate artificial fingerprints that imitate real ones, achieves an unprecedented level of accuracy: only one out of five samples was recognized as fake. But how does it manage to fabricate 'real' data so easily? And what does this mean for our biometric security?
The technological problem of biometric data
The most fundamental problem with fingerprint scanning is that the technology itself is not perfect yet. We need to have our fingerprints scanned to travel to certain countries, unlock mobile devices, make online payments and even enter some gyms. The technology entered our everyday lives a while ago and has remained largely unchanged since.
The mechanism behind the scanning procedure tends to oversimplify the unique patterns of biometric data.
However, it is not widely known that the mechanism behind the scanning procedure tends to oversimplify the unique patterns of biometric data: it does not read our fingerprints in full. Rather, it collects certain parts and elements of the overall pattern. Until now, this was seen as sufficient for the purposes it served. But these recent research findings raise serious questions about its adequacy.
An identical procedure is followed when the biometric data is “read” back for verification. There, too, the present system does not picture fingerprints in full: it merely scans partial elements of the whole. This specific feature of the scanning procedure was the starting point for the machine learning study at New York University.
Since the system has these obvious weaknesses, the researchers exploited the “partial scan against the partial records” comparison for machine learning purposes. Among humans, certain fingerprint patterns are more common than others. With this information, and with careful processing, it proved possible to replicate parts of the most common features. Our biological specificity was used to create artificial biometric data. As a result, we get easily fabricated fingerprints claimed to be 'ours'.
Through the method applied in the study, we get easily fabricated fingerprints claimed to be 'ours'.
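The weakness described above can be illustrated with a toy simulation. This is not the researchers' algorithm: the feature pool, template size and match threshold below are invented for illustration. The point it demonstrates is that when a verifier stores and checks only a partial template, a single candidate built purely from statistically common features can match many different users at once.

```python
# Toy illustration of partial-template matching (not DeepMasterPrints itself).
# All feature IDs, sizes and thresholds here are hypothetical.
import random

random.seed(42)

def enroll(user_print, template_size=12):
    # A partial sensor stores only a small subset of the user's features.
    return set(random.sample(sorted(user_print), template_size))

def matches(candidate, template, threshold=0.5):
    # Verification succeeds if enough of the stored features are reproduced.
    return len(candidate & template) / len(template) >= threshold

# Simulate users whose prints are biased toward a pool of common features,
# mirroring the fact that some fingerprint patterns occur more often.
common = set(range(30))
users = [common | set(random.sample(range(30, 100), 20)) for _ in range(50)]
templates = [enroll(u) for u in users]

# A "master" candidate built only from the common features.
master = common
fooled = sum(matches(master, t) for t in templates)
print(f"master candidate accepted for {fooled} of {len(templates)} users")
```

Because every stored template is only a fragment of the full print, and the fragments are dominated by common features, one artificial candidate clears the threshold for a sizeable share of users without matching any single print exactly.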
The research was based on the analysis of more than 6,000 fingerprint samples. Even though the technique was developed to target cell phones, it could be applied more widely, for example to personal data breaches and identity theft.
Machine learning has recently made huge strides and is said to be one of the most dynamic fields in contemporary research. Judging by the impressive results of this study, it is also quite accurate: the failure rate of the fingerprint fabrication was one out of five. In other words, four out of every five fingerprints created with the help of machine learning and computational technology passed for genuine.
However, fingerprints and biometric data are not the only things that can be fabricated by artificial intelligence. Fake faces, video and audio, synthetic voices, movement prediction and other personal features can already be programmed with ease. Biometric data generation is thus an essential step forward in this chain of developments, and artificially created fingerprints are just one element in a much larger net of advances in the field. Researchers around the world keep showing society how easy it is to fake almost any type of personal information, including biometric data, which many believe to be conclusive and infallible.
Growing privacy issues
With the ability to manufacture and copy fingerprints, privacy issues have become more pressing than ever. This kind of technology can unlock nearly any mobile device that is activated with a fingerprint scan. But it can also work against border and customs control, criminal records, bank account access and so forth: we are fully 'hackable' now. DeepMasterPrints operates on the principle of a master key that opens all the rooms in the house. And consider the scale: with the help of this small bit of tech, a huge number of genuine fingerprints can be imitated.
This research shows that we are fully 'hackable' now.
The research thus highlights existing problems with biometric data technologies and data collection. The scientists promise not to disclose the algorithm behind the fingerprint creation, because doing so could immediately provoke the mass fabrication of our data. Still, it makes us think about the fundamental privacy problems that come with the growing digitalization of society. Are our private data really safe? In many cases we are forced to surrender our biometric data, yet their security is not entirely guaranteed. It can be said that once we give away our biometric data, we surrender any form of control over them.
What can be done?
We give away our data, and new threats of falsification keep appearing day by day. This could, however, be prevented. One way to stop it is to prohibit or restrict the collection of intimate data such as fingerprints or face scans. At this point in time, however, this option is very unlikely and nearly impossible: many of the world’s security systems are already too deeply committed to these procedures, and there is limited interest in changing direction.
Also, in spite of the privacy issues raised here, fingerprint collection is still one of the most secure ways to enable safe travel or banking transfers. Rolling this back to the times when it was not an option could create even more opportunities for the fabrication of private information and identity theft.
The second way would be to use other types of biometric data. Alexandr Honchar, a biometrics specialist, says that DNA, ECG and EEG records are relatively easy to collect and much harder to fabricate. They also allow for different scanning devices that would use more complicated verification algorithms. Scanning of gait, voice, iris, facial features and much more is now actively being developed with a view to implementation in the very near future.
Different types of biometry could be merged to allow for more secure usage, leveling out the various limitations in dealing with mistakes and privacy issues.
There is still a risk of partiality, where none of the above-mentioned biometric measuring devices would read the data fully. In such cases, different types of biometry could be merged to allow for more secure usage. Combined, they would level out each other's limitations in dealing with mistakes and privacy issues.
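As a rough illustration of how such merging might work, the sketch below fuses match scores from several hypothetical biometric readers with a weighted sum. The modality names, weights and acceptance threshold are assumptions for this sketch, not a production scheme; the point is that a spoofed fingerprint alone fails when the other readings disagree.

```python
# Minimal sketch of multimodal score fusion: several biometric readers each
# produce a match score in [0, 1], and no single modality decides alone.
# Modality names, weights and the threshold are illustrative assumptions.

def fuse(scores, weights, threshold=0.7):
    """Weighted-sum fusion of per-modality match scores."""
    assert set(scores) == set(weights)
    total = sum(weights.values())
    fused = sum(scores[m] * weights[m] for m in scores) / total
    return fused >= threshold, round(fused, 3)

weights = {"fingerprint": 1.0, "iris": 2.0, "ecg": 1.5}

# A spoofed fingerprint scores highly, but the other modalities disagree,
# so the fused score stays below the acceptance threshold.
accepted, fused = fuse({"fingerprint": 0.95, "iris": 0.2, "ecg": 0.3}, weights)
print(accepted, fused)

# A genuine user produces consistent scores across all modalities.
accepted2, fused2 = fuse({"fingerprint": 0.9, "iris": 0.85, "ecg": 0.8}, weights)
print(accepted2, fused2)
```

The weighting also lets a system lean on the modalities that are hardest to fabricate (here the iris and ECG readings) without discarding fingerprints entirely.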
The measuring devices also need to keep pace with the times. When we think of DNA or ECG verification, long and tiresome analytical procedures come to mind. In this respect, however, contemporary technical solutions are becoming ever more convenient, increasingly allowing small, compact devices to read our data within seconds.
For example, DNA analysis could be accelerated with the help of a saliva or sweat scan, or by means of micro ink injected under the skin (which one would not even notice). For ECG and EEG data, wearable tech is already being developed to ease the procedure.
Is there a particular interest in implementing measures that would make data fabrication harder? A new protection system would take billions of dollars to develop and launch. Even if more advanced biometric data collection is implemented, years will pass before it reaches all corners of the world. Until then, no one is safe from these privacy issues.
What this new research makes compellingly clear is the inherent danger attached to the normalization of collecting and using biometric data.
What should also be widely discussed are the risks of collecting more fundamental biometric data. We have seen the possibilities for faking the most basic and intimate type of data, namely fingerprints. What will happen if we give away our DNA samples, and where are the guarantees that there won’t be a way to manipulate them? These and many more questions demand answers in the immediate future. And such questioning should not stop: recent trends towards more advanced biometric data use will constantly invite, and intensify, issues of this kind.
What this new research makes compellingly clear is the inherent danger attached to the normalization of collecting and using biometric data. Governments should think carefully before implementing biometric data procedures, and citizens should take a firm, critical and inquisitive stance towards plans for their implementation. Privacy is still crucial to a democracy; it is the precondition for being truly free. In the digital age, privacy will need to be fought for.
Bontrager, P., et al. (2018). DeepMasterPrints: Generating MasterPrints for Dictionary Attacks via Latent Variable Evolution. New York University Tandon.
Hern, A. (2018). Fake fingerprints can imitate real ones in biometric systems – research. The Guardian.
Honchar, A. (2018). Personal blog of Alex Honchar about AI advances.
Oberhaus, D. (2018). Researchers Created Fake 'Master' Fingerprints to Unlock Smartphones. Motherboard.