Len Klie

Voice Biometrics Gets an Added Layer

A team of computer scientists at Hong Kong Baptist University has put a new spin on common speech-based biometric authentication methods by developing technology that can identify individuals through their lip movements.

The effort, led by Prof. Cheung Yiu-ming, involves matching the lip movements associated with saying specific passwords to other visual lip characteristics, including shape and texture. It is, appropriately, called the lip motion password.

The researchers claim that lip movements are unique to each individual and cannot be duplicated. The system they are developing can pick up those subtle differences and deny or grant access to systems or information based on them. “The same password spoken by two persons is different, and a learning system can distinguish them,” Cheung said in a statement.
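The article does not describe the researchers' actual model, but the basic idea of a system that compares a new lip-motion attempt against an enrolled trace of the same password can be sketched. The following is a purely illustrative example, not the patented method: it represents each utterance as a hypothetical sequence of per-frame lip-shape features (the two-number "width/height" traces below are made up) and accepts or rejects based on dynamic time warping (DTW) distance, a common technique for comparing motion sequences of different speeds.

```python
# Hypothetical sketch only -- NOT the Hong Kong Baptist University system.
# A lip password attempt is modeled as a sequence of per-frame feature
# vectors; we compare it to an enrolled template with dynamic time warping
# and grant access only if the distance falls under a threshold.

from math import dist

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two feature-vector sequences."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(seq_a[i - 1], seq_b[j - 1])  # per-frame distance
            cost[i][j] = d + min(cost[i - 1][j],      # skip a template frame
                                 cost[i][j - 1],      # skip an attempt frame
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

def verify(template, attempt, threshold=1.0):
    """Accept the attempt only if its lip-motion trace is close enough."""
    return dtw_distance(template, attempt) <= threshold

# Toy 2-D "lip width / lip height" traces (invented numbers).
enrolled  = [(1.0, 0.2), (1.4, 0.6), (1.1, 0.3)]
same_user = [(1.0, 0.25), (1.35, 0.6), (1.1, 0.35)]
impostor  = [(0.5, 0.9), (0.6, 1.1), (0.4, 0.8)]

print(verify(enrolled, same_user))  # small distance -> True
print(verify(enrolled, impostor))   # large distance -> False
```

A production system would learn its features and decision boundary from data rather than use a fixed threshold, but the sketch shows why two people mouthing the same password can still be told apart: their frame-by-frame motion traces differ.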

According to Cheung, lip motion is better than voice authentication, which can be hindered by outside factors like background noise. Further, lip passwords can be used by people with hearing or speech impairments, he adds, and aren't affected by language barriers.

The researchers have already obtained a patent for the method and expect to see it first deployed in the financial services industry for ATM verification and mobile payments. Another use case might be for building access.

Another stated goal of the project is to integrate the technology with facial recognition authentication systems for an added layer of security and verification.