Biometric Technology: A Brief History

The history of modern biometric technology began around the 1960s with semi-automated systems, which have since evolved into high-tech scanners that read biomarkers with very high accuracy. In 1969, the Federal Bureau of Investigation (FBI) pushed for automated fingerprint identification, which led to the analysis of minutiae, the very small distinctive points that help map unique fingerprint patterns.

In 1975, the FBI also funded prototypes of scanners that could extract fingerprint features, although digital storage costs were prohibitive at the time. The National Institute of Standards and Technology (NIST) worked on matching algorithms and image compression, which led to the development of the M40 algorithm that the FBI used. The M40 algorithm reduced the human search effort by producing a significantly smaller set of candidate images, which trained and specialized human technicians then evaluated. These developments helped to improve fingerprint technology considerably.

As the 1990s arrived, biometric science took off: the National Security Agency (NSA) established the Biometric Consortium, the Department of Defense (DoD), in partnership with the Defense Advanced Research Projects Agency (DARPA), funded commercial face recognition algorithms, and Lockheed Martin built an automated fingerprint identification system for the FBI.

Further developments followed, such as West Virginia University's establishment in 2000 of a bachelor's program in Biometric Systems Engineering and Computer Engineering. The International Organization for Standardization (ISO), a non-profit body that encouraged international collaboration in biometrics research, also helped standardize the use of generic biometric technologies.

United States immigration authorities also began using biometrics to process visa applications from legitimate travelers. This enhanced security, as biometric data such as voice samples, DNA swabs, and fingerprints could be used to identify national security threats.

The rising prominence of smartphones was also a key driver of biometric technology: Apple introduced Touch ID with the iPhone 5S in 2013. Touch ID is a key feature on iOS phones and other Apple devices that allows users to unlock their phones and use digital-signature APIs with fingerprint authentication. When Touch ID was introduced, Apple clarified that fingerprint data is stored directly on the device's chip and never on iCloud or Apple servers.
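
Touch ID is exposed to third-party apps through Apple's LocalAuthentication framework. As a rough illustration, here is a minimal Swift sketch of how an app might request fingerprint (or face) authentication; the function name and prompt string are hypothetical, and the app only ever receives a pass/fail result, consistent with Apple's point that the fingerprint data itself never leaves the device.

```swift
import LocalAuthentication

// Minimal sketch: ask the system to verify the device owner with biometrics.
// The biometric data stays on the device; the app only receives a Bool.
func authenticateWithBiometrics(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Make sure biometrics (Touch ID or Face ID) are available and enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // Prompt the user; the reason string is shown in the system dialog.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```

In practice, a successful result is typically used to unlock keys held in the device's secure hardware, which is how signing operations can be gated behind the same fingerprint prompt.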

Today, many people can use biometric identity verification for personal purposes because scanning sensors are integrated directly into smartphones. Millions of Samsung and Apple customers welcomed the biometric fingerprint scanners added to their phones, and Apple later transitioned to face recognition with the release of the iPhone X.

Biometric security and other biology-based technologies are expected to be widely used to keep data safe as 5G makes big data and the Internet of Things more accessible than ever. Standards bodies such as the W3C and the FIDO Alliance now govern how biometrics are used, and biology-based security, once a futuristic concept, is widely accepted today.

For more information about the history of biometric technology, here is an article from Login ID.
