IBM has said it will no longer sell general-purpose facial recognition software as the backlash against the technology continues to gather pace.
The move comes amid mounting scrutiny of digital surveillance, as protests over police brutality spread across the world following the death of George Floyd in Minnesota last month.
In an open letter to the US Congress, IBM CEO Arvind Krishna said: “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency.”
Krishna added that the company believes “now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies”. However, the company has not ruled out selling facial recognition services for specific purposes.
But some critics have questioned how much the move will actually affect the adoption of facial recognition by law enforcement agencies. IBM was reportedly trailing the leading facial recognition vendors, and while Google called for a moratorium on the software last year, Microsoft and Amazon Web Services are still selling facial recognition products.
Analysts have also cast doubt on how much money IBM was making from facial recognition software. Laura Petrone, an analyst at GlobalData, told NS Tech: “Among the big tech companies, Amazon and Google, which have heavily invested in machine learning and rely on advanced cloud operations, are the leaders in key technologies to enable computer vision [CV].
“IBM can also rely on a strong cloud business and powerful AI engine, but it simply can’t match the two companies’ repositories of visual data. On the other hand, CV is sold as a type of software as a service and, as we see an increased commoditization of this technology, particularly of facial recognition [FR], the revenue growth is expected to slow down in the long term. This is to say that CV as a service wasn’t a particularly profitable business for IBM.”
Petrone added: “It looks like the protests across America after the killing of George Floyd will give new vigour to the debate around the use of FR technology. There are significant risks that FR used in law enforcement is unreliable. FR training data is often incomplete or unrepresentative of the general population.
“Studies have shown that it works differently across genders and races. The darker the skin, the more errors arise. In the debate around reforming the police in the US, the use of such inaccurate technology by law enforcement authorities will be extremely difficult to justify.”
This article originally appeared in the New Statesman.