British police officers are among those concerned that the
use of artificial
intelligence in fighting crime is raising the risk of profiling bias,
according to a report commissioned by government officials. The paper warned
that algorithms might judge people from disadvantaged backgrounds as “a greater
risk” since they were more likely to have contact with public services, thus
generating more data
that in turn could be used to train the AI.
“Police officers themselves are concerned about the lack of safeguards and oversight regarding the use of algorithms in fighting crime,” said researchers from the Royal United Services Institute, a defence think-tank. The report
acknowledged that emerging technology
including facial recognition had “many potential benefits”. But it warned that
assessment of long-term risks was “often missing”.