Thanks to new software, a thief was apprehended in the Siberian coal-mining city of Kemerovo. The facial recognition system, developed by a Moscow-based startup, had already helped identify nearly 180 wanted criminals during the 2018 World Cup.
When authorities in the Russian city of Kemerovo decided to test a new facial recognition system to “see how it goes,” they didn’t expect to catch a criminal so quickly. Only five cameras were installed, in local convenience stores. Just four days after the experiment was launched, the software identified a wanted robbery suspect.
FindFace Security is currently the only face recognition solution available in Russia. NtechLab, the startup that developed the system, had already seen positive results during the 2018 World Cup.
A total of 500 video surveillance cameras mounted around the city and stadiums were connected to FindFace. As a result, almost 180 wanted criminals have been successfully detained. Nine of them, according to Moscow Mayor Sergey Sobyanin, were caught in the metro.
“The case in Kemerovo demonstrates how facial recognition, even with a limited number of cameras and in the shortest time, can benefit society and business,” said Mikhail Ivanov, head of NtechLab. “FindFace Security is equally effective at small and large-scale sites. It is capable of processing video from hundreds of thousands of cameras in real time.”
The company claims that FindFace Security is more than 99 percent accurate and can compare two faces to determine whether they belong to the same person. The software detects faces in a video stream in real time with high precision, compares the results against databases of wanted individuals, and sends alerts to the police within two seconds.
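The article does not describe NtechLab’s internal pipeline, but face verification systems of this kind typically reduce each face image to an embedding vector and declare a match when two embeddings are sufficiently similar. The sketch below illustrates that general idea with random stand-in vectors; the embedding size, similarity measure, and threshold are all assumptions for illustration, not details of FindFace itself.

```python
import numpy as np

# Illustrative sketch of 1:1 face verification. A real system would
# produce the embeddings with a neural network; here we stand in
# random vectors so the example is self-contained.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(emb_a: np.ndarray, emb_b: np.ndarray,
                   threshold: float = 0.6) -> bool:
    # Embeddings closer than the threshold are declared the same
    # person; the threshold trades false accepts against false rejects.
    return cosine_similarity(emb_a, emb_b) >= threshold

rng = np.random.default_rng(0)
face = rng.normal(size=128)                    # embedding of a probe face
same = face + rng.normal(scale=0.1, size=128)  # slightly perturbed copy
other = rng.normal(size=128)                   # embedding of an unrelated face

print(is_same_person(face, same))   # perturbed copy: high similarity
print(is_same_person(face, other))  # unrelated face: low similarity
```

The “two seconds” figure quoted for alerts would cover the whole chain (detection, embedding, database lookup, notification), of which the comparison step shown here is only the final, cheapest part.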
Whether facial recognition software is a reliable way to identify suspects remains an open question. Many systems, even those developed by tech giants such as Amazon and Microsoft, are controversial: they struggle to recognize people who are not white, or simply make mistakes.
A report from MIT’s Media Lab tested facial recognition systems from Microsoft, IBM, and China’s Megvii and found that up to 35 percent of darker-skinned women had their gender misidentified. Back in 2015, Google Photos labeled a software engineer’s black friends in a photo as “gorillas,” and the company had to apologize for the error.
In summer 2018, the American Civil Liberties Union ran a series of tests with Amazon’s Rekognition system.
The software matched photographs of members of the U.S. Congress against a database of 25,000 publicly available shots of people who had been arrested. The system incorrectly identified 28 members of Congress as criminals.
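One reason such 1:N searches go wrong is simple arithmetic: even if each individual comparison is very accurate, searching one probe photo against tens of thousands of gallery entries multiplies the chances of a spurious hit. The calculation below illustrates the effect with an assumed per-comparison false match rate; it is not a measurement of Rekognition’s actual error rate.

```python
# Why large galleries produce false matches: with an assumed
# per-comparison false match probability, the chance that at least
# one of the 25,000 gallery entries falsely matches a probe is high.

gallery_size = 25_000        # size of the ACLU's mugshot database
false_match_rate = 0.0001    # assumed per-comparison false match probability

# P(at least one false hit) = 1 - P(no comparison falsely matches)
p_at_least_one = 1 - (1 - false_match_rate) ** gallery_size
print(f"{p_at_least_one:.0%}")  # roughly 92% under these assumptions
```

Under these illustrative numbers, nearly every probe would trigger at least one false hit, which is why identification systems need much stricter thresholds (and human review) than one-to-one verification.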
NtechLab, however, claims its verification algorithm is the best in the world: the company took the top prize at a competition organized by the Intelligence Advanced Research Projects Activity (IARPA), which operates under the auspices of the Office of the Director of National Intelligence (ODNI).