Washington, DC (CNN Business) — Federal researchers have found widespread evidence of racial bias in nearly 200 facial recognition algorithms in an extensive government study, highlighting the technology's shortcomings and potential for misuse.

Racial minorities were far more likely than whites to be misidentified in the US government's testing, the study found, raising fresh concerns about the software's impartiality even as more government agencies at the city, state and federal level clamor to use it.
Amazon, which sells facial recognition software to police departments, was not among the participants. The National Institute of Standards and Technology (NIST), which conducted the study, told CNN Business that submissions were voluntary and that Amazon informed the agency it did not think its software was compatible with the test.