The New York Times: Many Facial-Recognition Systems Are Biased, Says U.S. Study


The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

View the full article in The New York Times



US Tech Future is a Verizon-led, community-focused initiative working to engage local communities in a discussion about technology and how it can improve the lives of residents and the community as a whole.

Our mission is to engage with citizens and community stakeholders across the United States and provide information on how technology can dramatically change the way we work and live in our communities.