When a Human Rights Watch report came out on May 1 linking artificial intelligence startup Megvii Technology to police surveillance of a Chinese ethnic minority, the company’s investors grew nervous. To reassure them, Megvii executives quickly convened a conference call and vowed to get to the bottom of the story.
A month later, the New York-based watchdog corrected its report to say that Megvii’s facial recognition code was present in the police app but inoperable. How it got there remains a mystery, but by then the public relations damage was done, with some of Megvii’s shareholders, and even their shareholders’ shareholders, publicly questioned about their views...