The New York Times: Many Facial-Recognition Systems Are Biased, Says U.S. Study


The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. In a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

View full article in The New York Times



US Tech Future is a Verizon-led, community-focused initiative that engages local communities in a discussion about technology and how it can improve the lives of residents and benefit the community as a whole.

Our mission is to engage citizens and community stakeholders across the United States, providing information on how technology can dramatically improve the way we work and live in our communities.

@TheUSTechFuture