The company says its tools are designed to keep people safe. “Clearview AI’s investigative platform allows law enforcement to rapidly generate leads to help identify suspects, witnesses and victims to close cases faster and keep communities safe,” the company says on its website.

But Clearview has faced other intense criticism, too. Advocates for responsible uses of AI say that facial recognition technology often disproportionately misidentifies people of color, making it more likely that law enforcement agencies using the database could arrest the wrong person.

The vast majority of people have no idea their photographs are likely included in the dataset that Clearview’s tool relies on. “And when it comes to the people whose images are in their data sets, they are not aware that their images are being used to train machine learning models. They don’t ask for consent,” says Abeba Birhane, a senior fellow for trustworthy AI at Mozilla.