Computer system authentication is critical to keeping systems safe from unwanted users, intruders, and abusers. At its core, authentication is the process by which a potential user proves they are an authorized user.
As most of you know, authentication for secure computer systems generally falls into three categories:
- What you know (e.g., a password)
- What you are (e.g., biometrics)
- What you have (e.g., a smart card)
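The three factors above might be combined in code along these lines. This is a minimal sketch with hypothetical function names, not any particular product's API:

```python
# Sketch of the three authentication factors (all names are
# hypothetical, for illustration only).
import hashlib
import hmac

def check_knowledge(password: str, stored_hash: bytes, salt: bytes) -> bool:
    # "What you know": compare a salted hash, never the raw password.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

def check_possession(token_code: str, expected_code: str) -> bool:
    # "What you have": e.g., a one-time code from a smart card or token.
    return hmac.compare_digest(token_code, expected_code)

def check_inherence(scan_template: bytes, stored_template: bytes) -> bool:
    # "What you are": real systems do fuzzy matching on a scanned
    # template; an exact byte comparison is a stand-in here.
    return hmac.compare_digest(scan_template, stored_template)
```

Note that the first two factors compare *replaceable* secrets, while the third compares a scan of the user's body against a stored template. That distinction is the heart of the problem discussed below.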
With the many failures of password-protected systems and the difficulty of getting people to remember complex passwords, many organizations are plunging headlong into biometric authentication. The promise is that biometrics are a far more secure method of authentication because they rely on physical features unique to the user.
Biometrics are "what you are." These might include your fingerprint, your retina scan, your iris scan, your facial recognition scan, or maybe, eventually, a DNA scan. These traits are more precise and, in principle, unique in identifying the user of a computer system.
While passwords can be guessed or cracked (especially now, with increasingly powerful GPU- and ASIC-based cracking systems), you can't really guess someone's retina scan. (Of course, you could extract their eyeball and hold it up to the retina scanner, as in Angels & Demons or Demolition Man.)
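To see why passwords are so guessable, here is the core of a dictionary attack in a few lines. An attacker who obtains an unsalted hash just tests candidate words until one matches; GPU and ASIC rigs run this same loop billions of times per second:

```python
# Dictionary attack sketch: test candidate passwords against a
# leaked, unsalted SHA-256 hash until one matches.
import hashlib

def crack(target_hash, wordlist):
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

# Simulate a hash leaked from a breached database.
leaked = hashlib.sha256(b"letmein").hexdigest()
print(crack(leaked, ["password", "123456", "letmein", "qwerty"]))  # letmein
```

There is no equivalent wordlist for retinas, which is exactly the appeal of biometrics.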
These techniques are touted as the ultimate solution to the problems of authentication. The thinking goes: since a fingerprint or retina is supposedly unique, a hacker or intruder who lacks those biologically determined traits cannot authenticate against the system.
The problem with biometrics as an authentication scheme is not that the biological trait itself can be easily replicated, but that the file containing the trait can be stolen. When we build a biometric authentication system, every authorized user must scan their particular biological trait (say, a fingerprint) into the system. That scan is digitized and stored, usually in a database, and the digitized biometric data can then be stolen.
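The enrollment flow just described can be sketched in a few lines. The table and field names here are hypothetical, but the point is general: the digitized template lands in a database row, and anyone who can read that row has the user's biometric "secret":

```python
# Enrollment sketch (hypothetical schema): the digitized trait file
# is stored in a database, which becomes the attack surface.
import sqlite3

def enroll(db, user, template):
    # Store the digitized trait; if this table leaks, so does the trait.
    db.execute("INSERT INTO biometrics (user, template) VALUES (?, ?)",
               (user, template))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE biometrics (user TEXT PRIMARY KEY, template BLOB)")
enroll(db, "alice", b"\x12\x34 digitized-fingerprint-template")

# An attacker with read access to the database simply copies it out:
stolen = db.execute("SELECT template FROM biometrics WHERE user = ?",
                    ("alice",)).fetchone()[0]
```

Unlike a password table, this table cannot be protected by one-way hashing alone, because biometric matching is fuzzy: the system generally needs to compare against the stored template, not a hash of it.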
When Apple first introduced its fingerprint authentication system (Touch ID) for the iPhone, hackers found a way to steal the fingerprint file within 24 hours. There is no need to replicate the fingerprint itself: simply steal the file containing the scanned print and re-use it.
Unlike a password or a smart card, a stolen biometric cannot simply be replaced or changed. That trait is for life. When my password is stolen, I change my password. When my smart card is stolen, I replace it. When my fingerprint is stolen, I can't change my fingerprints (well, at least not without surgery).
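The contrast is easy to show in code. Rotating a compromised password just means re-hashing a new secret with a fresh salt; there is no analogous reset for a fingerprint (the function names below are illustrative, not from any real library):

```python
# Credential rotation sketch: passwords rotate, biometrics don't.
import hashlib
import os

def set_password(password):
    # Derive a fresh salted hash; the old hash becomes worthless.
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt1, hash1 = set_password("old-secret")  # this credential leaks...
salt2, hash2 = set_password("new-secret")  # ...so we simply rotate it
assert hash1 != hash2                      # the stolen hash is now useless

# There is no set_fingerprint("new-finger"): the template is fixed for life.
```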
Once a hacker steals my fingerprint or another biometric trait, they have it forever. Wherever that trait is used for the rest of my life, the hacker can use my identity to authenticate to systems that only I should be able to enter.
I think the industry needs to pause before rushing headlong into this biometric craze and weigh the long-term consequences, which may create a problem of greater magnitude than the one biometrics are meant to fix.
Ultimately, this move to biometrics may hand the advantage to the hacker or intruder. If the biometric trait file is captured, the user has no way to change their authentication trait; they can't change their retina or their fingerprint. Once biometric authentication is compromised, it is compromised forever.