The New York Times: Many Facial-Recognition Systems Are Biased, Says U.S. Study

The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. In a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

View full article in The New York Times

RELATED POSTS

Community Partnership Chat: Crossroads Rhode Island

March 23, 2023
In celebration of Women's History Month, Verizon's Adriana Dawson is sitting down with women who are inspiring her not only in Rhode Island, but beyond. Today, she's joined by Bernice...

FCC Grants $66 Million to Boost Awareness of Affordable Broadband Program

March 13, 2023
The US Federal Communications Commission wants to make sure eligible Americans can get internet service through the Affordable Connectivity Program, an initiative the agency calls the nation's "newest and largest" broadband...

Be a community partner in the 4th industrial revolution.

Learn More

US Tech Future is a Verizon-led, community-focused initiative that engages local communities in a discussion about technology and how it can improve the lives of residents and benefit the community as a whole.

Our mission is to engage with citizens and community stakeholders across the United States, providing information on how technology can dramatically change the way we work and live in our communities.

@TheUSTechFuture