The New York Times: Many Facial-Recognition Systems Are Biased, Says U.S. Study


The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The systems falsely identified African-American and Asian faces 10 times to 100 times more than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

View full article in The New York Times

