From 1 July this year, the judicial system of San Francisco (USA) plans to introduce, on an experimental basis, the practice of using artificial intelligence to help prosecutors eliminate the possibility of racial bias when charging a suspect, reports The Verge.
According to the office of the district attorney of San Francisco, the goal is not only to analyze police reports and automatically remove data that could reveal the suspect's race (eye, hair, and skin color), but also data about the suspect's surroundings (relatives, friends, neighbors, and so on) that could indirectly indicate that the suspect belongs to a particular race. In addition, the AI will remove from the records the names and details of witnesses and police officers, to exclude factors that could affect the impartiality of prosecutors.
"If you look at the prison population in the United States, you will notice that there are more men and women of color than whites," says San Francisco district attorney George Gascon.
Seeing a suspect's name, for example Hernandez, investigators may immediately conclude that the person is of Hispanic origin, and this in turn can affect the course and conclusions of the investigation.
It is noted that in 2017, at the request of the district attorney, statistics on criminal cases in San Francisco were analyzed. It was found that between 2008 and 2014, African Americans accounted for 41% of arrests, although they made up only 6% of the total population. Analysts concluded that court decisions show "significant racial and ethnic disparities".
Gascon explained that prosecutors will first examine the processed police reports and make a charging decision on that basis. They will then be able to review the full report, with all names and data, to determine whether there are extenuating circumstances in the case.
Currently, the judicial system in San Francisco uses a much more limited procedure: reports sent to prosecutors are filtered manually so that prosecutors do not see this information. As a rule, however, only the first page of the report, which contains general data about the suspect, is removed. Information about the person remains elsewhere in the report, so this approach has little practical value.
"We needed to bring machine learning technology into the work," says Gascon.
According to him, this will be the first such use of artificial intelligence in the United States. Gascon also said that he does not know of any law enforcement agency that has previously used AI for this purpose. He added that the technology will first be tested in his office; then, if it shows good results, it may be made available to prosecutors across the country.
The system was developed by programmers, analysts, and engineers from the Stanford Computational Policy Lab. According to one of its authors, Alex Chohlas-Wood, it is a small web application that uses several machine learning algorithms to automatically edit police reports, marking certain words and replacing them with neutral placeholders such as "place", "officer 1", "suspect", and so on.
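The article does not describe the Stanford tool's implementation, but the core idea of marking sensitive terms and substituting neutral placeholders can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the word lists, names, and simple regex matching stand in for the machine learning models the real system reportedly uses.

```python
import re

# Hypothetical dictionaries for illustration only; the real system
# reportedly identifies such terms with machine learning models,
# not fixed word lists.
RACE_TERMS = {"black", "white", "hispanic", "asian"}
OFFICER_NAMES = ["Smith"]       # assumed example name
SUSPECT_NAMES = ["Hernandez"]   # example name from the article

def redact(report: str) -> str:
    """Replace race descriptors and names with neutral placeholders."""
    for term in RACE_TERMS:
        # Case-insensitive whole-word match for racial descriptors.
        report = re.sub(rf"\b{term}\b", "[race]", report, flags=re.IGNORECASE)
    for i, name in enumerate(OFFICER_NAMES, start=1):
        # Numbered placeholders keep distinct officers distinguishable.
        report = re.sub(rf"\b{name}\b", f"officer {i}", report)
    for name in SUSPECT_NAMES:
        report = re.sub(rf"\b{name}\b", "suspect", report)
    return report

print(redact("Smith observed Hernandez, a Hispanic male."))
# -> officer 1 observed suspect, a [race] male.
```

Numbered placeholders like "officer 1" preserve who did what in the narrative while hiding identities, which matches the redaction style the article describes.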