For over 100 years, we have used unique physical characteristics to identify individuals. This area of study, biometrics, has played an essential role in national security for nearly as long, and technology today is accelerating the speed at which we can identify individuals by their biometrics. This intersection of technology and identity creates new opportunities, and new challenges, with far-reaching impacts for safety, security, and defense.
Federal, state, local, and commercial organizations collect and use biometrics to manage identities across operational spaces, or domains, to safeguard the public. These organizations apply biometrics in many different ways across their missions. In many cases, these systems and their associated data can affect critical life-and-safety decisions. Whatever the mission set, many factors can affect the quality and usefulness of a biometric. Poor-quality biometrics can undermine a system's ability to produce consistent, timely, accurate, and reliable match results, reducing the effectiveness of technology in accelerating the determination of identity. For example:
- Collection Conditions: Working outdoors poses many challenges to collecting a high-quality capture. For example, direct sunlight can create capture quality issues with iris cameras and optical fingerprint scanners. Often, individuals collecting biometrics must work quickly under less-than-ideal circumstances, which can result in quality issues with no opportunity to retake the biometric due to logistical or safety constraints.
- Devices Used: Older devices have lower resolution, are less sensitive, or can be more affected by environmental conditions. Sensors and lenses can become dirty or scratched, resulting in unintended image artifacts.
- Capture-Side Workflow Processes: Technology is only as effective as the usability and process built around it. Depending on the capture device software, a variety of issues can occur. For example, if quality-score feedback (a programmatic assessment of the value of a specific biometric artifact) is not available, an operator may not know to recapture a poor scan or image. If the software is difficult to use, it can inadvertently lead an operator to mislabel metadata, biographic data, or transaction data. In these cases, user-centered design is critical to effective data capture and usage.
- Biometric Data Transmission Specifications: There are a number of schemas and specifications for tagging and transmitting biometrics and their associated metadata with esoteric names like EBTS and Homeland Security’s IXM. Incorrect use of tags and schemas or invalid translations between formats and specifications can introduce data quality issues that hinder a matching system’s effectiveness, reducing speed and accuracy.
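To make the quality-score feedback idea concrete, here is a minimal sketch of a capture-side quality gate. The function names `capture_fn` and `score_fn`, the threshold, and the retry limit are all illustrative assumptions, not a real device SDK; the score convention loosely follows fingerprint quality metrics such as NFIQ 2, where scores run 0-100 and higher is better.

```python
# Hypothetical capture-side quality gate: capture_fn and score_fn are
# placeholders for a device SDK and a quality-scoring routine, not real APIs.
RECAPTURE_THRESHOLD = 40   # illustrative cutoff, not an official value
MAX_ATTEMPTS = 3           # avoid holding up field collection indefinitely

def capture_with_quality_gate(capture_fn, score_fn):
    """Retry a capture until its quality score clears the threshold,
    keeping the best attempt seen so far as a fallback."""
    best_image, best_score = None, -1
    for attempt in range(1, MAX_ATTEMPTS + 1):
        image = capture_fn()
        score = score_fn(image)
        if score > best_score:
            best_image, best_score = image, score
        if score >= RECAPTURE_THRESHOLD:
            # Good enough: accept and stop prompting the operator.
            return image, score, attempt
        # Otherwise, prompt the operator to recapture while conditions allow.
    # All attempts fell short; transmit the best capture we managed to get.
    return best_image, best_score, MAX_ATTEMPTS
```

Even this simple pattern addresses the scenario above: the operator is told immediately when a capture falls short, rather than discovering the problem after the subject is gone.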
Biometric data quality is the single biggest factor in the performance of automated biometric matching systems, and a bad set of biometrics can have a cascading effect on accuracy and search times. These defect classes are difficult to identify in a collection once a system has grown to an operationally relevant size.
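As an illustration of how schema checks can catch transmission-side data quality issues before a record reaches a matching system, the sketch below validates a few metadata fields on a transaction record. The field names and allowed values here are hypothetical stand-ins; the actual EBTS and IXM specifications define their own mandatory fields, tags, and encodings.

```python
# Illustrative pre-transmission validation. These field names are
# hypothetical, not drawn from the real EBTS or IXM schemas.
REQUIRED_FIELDS = {"subject_id", "modality", "capture_date", "originating_agency"}
KNOWN_MODALITIES = {"fingerprint", "face", "iris"}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found; an empty list means the record passes."""
    # Flag any mandatory field that is absent from the record.
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    # Flag modality values the receiving system will not recognize.
    modality = record.get("modality")
    if modality is not None and modality not in KNOWN_MODALITIES:
        problems.append(f"unknown modality: {modality}")
    return problems
```

Rejecting or flagging a malformed record at the point of transmission is far cheaper than letting an invalid tag or mistranslated field degrade match accuracy downstream.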
Improving quality, consistency, and speed requires new approaches and technologies. At Dev Technology, we have been working with our teams to blend cutting-edge Artificial Intelligence and Machine Learning with traditional infrastructure and standards-based technologies to improve these critical attributes throughout the biometric and identity management lifecycle. The speed at which biometric matches are captured, evaluated, and returned is a critical component of delivering for the missions that depend on effective biometrics, and it sets the stage for even broader use of biometrics in the future.
Niroop Gonchikar is Dev Technology’s Technical Director of Biometrics.