Can justice be done quickly by algorithms?

European Court of Human Rights in Strasbourg.

An unfair eviction in Ukraine, a dispute over renting a house in Malta, the wrongful conviction of a Greek man for defamation. These are a selection of cases recently decided by the judges of the European Court of Human Rights (ECtHR) in Strasbourg. They are also disputes that were ‘handled’ by a robot, which correctly predicted the court’s decision in each of them.

JURI SAYS is an algorithm developed by Groningen PhD student Masha Medvedeva that draws on the same information as the judges. Through textual analysis, it determines whether the European Convention on Human Rights has been violated; which articles of the Convention are at issue is known in advance. Take the defamation case mentioned above: the program determined that the convicted man’s statements were nevertheless protected by freedom of expression.

In recent years the algorithm has assessed the ECtHR’s cases and, according to the researchers, initially made correct predictions in three-quarters of the cases submitted to it. The success rate has since dropped to about sixty percent. Can that percentage be pushed up further, so that we no longer need human judges? Medvedeva raises several objections and firmly rejects the idea.

Not very fair

Algorithms are not new to the judiciary. No machine actually sits in a judge’s seat and convicts people, but particularly in the United States the judiciary increasingly uses automated systems. They help judges assess, for example, the likelihood that an offender will reoffend or the risk that a person will flee. Such systems are not yet operational in the Netherlands, but a court recently experimented with an algorithm that prepares relatively simple ‘traffic cases’ for a judge.

The ‘robot judge’ JURI SAYS.

There are also algorithms that predict court decisions for research purposes; the US Supreme Court is a particularly popular subject. In 2017, scientists managed to predict more than 70 percent of its rulings correctly, higher than the 66 percent achieved by human experts in the same study. Some algorithms predict more than ninety percent of decisions correctly. Yet those programs that score well are often not entirely fair, Medvedeva writes in her dissertation: they analyze the final judgment, in which the facts have already been established and weighed.

“The question is what you are ‘predicting’ when you use information from the judgment,” says Medvedeva. “After all, a judgment sets out the facts, and the acquittal or conviction follows from them. Many algorithms claim to be able to predict judgments, but they do not.”

That could be done better, she thought. Together with colleagues, Medvedeva developed JURI SAYS, which predicts ECtHR judgments based on information that is available before the trial. The algorithm analyzes the words in the complaint and learns that certain words, or combinations of them, are characteristic of (previous) judgments. This type of text analysis is a commonly used method for predictive algorithms. “The algorithm doesn’t really understand what the case is about,” says the researcher.
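What that looks like in practice: below is a minimal sketch of the general technique the article describes, a bag-of-words model that learns which words and word pairs are characteristic of earlier outcomes. It is an illustration only, not Medvedeva’s actual code; the example complaints, the labels, and the choice of scikit-learn are assumptions.

```python
# Minimal sketch of word-based outcome prediction (illustration only,
# not JURI SAYS itself). The model learns which words and word pairs
# (n-grams) co-occur with each outcome in past cases.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Invented training data: complaint texts with known outcomes
# (1 = violation of the Convention found, 0 = no violation).
complaints = [
    "applicant evicted from his home without notice or hearing",
    "newspaper fined for publishing criticism of a public official",
    "tenant denied renewal of lease despite valid contract",
    "journalist convicted of defamation for a satirical column",
]
outcomes = [1, 0, 1, 0]

# Word frequencies are the model's only 'knowledge' of a case;
# it never understands what the dispute is actually about.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(complaints, outcomes)

# Predict the outcome of a new, unseen complaint.
print(model.predict(["applicant removed from rented house without hearing"]))
```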

Not for use in court

JURI SAYS is not intended for use in court. With it, Medvedeva wants to show how predictive algorithms work and what they can and cannot do. For example, she found that her algorithm’s success rate dropped when it was not regularly retrained on new judgments. “One of the reasons for this may be that our view, and judges’ view, of certain acts changes over time. What judges considered torture 30 years ago, for example, does not match the current definition,” she says. Such evolution in the law could be lost if judging were left to machines.
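That retraining point can be made concrete. The sketch below is a hedged illustration, not the project’s actual evaluation code; the data format and the cutoff year are invented. It trains a model once on judgments up to a cutoff and then scores the frozen model on each later year; if legal interpretation shifts, the per-year accuracy tends to fall.

```python
# Sketch: measure how a model trained once on older judgments performs
# on later years without retraining. The data format and cutoff year
# are invented for illustration.
from sklearn.metrics import accuracy_score

def accuracy_by_year(model, cases_by_year, train_until=2015):
    # cases_by_year: {year: (list of judgment texts, list of outcomes)}
    train_texts, train_labels = [], []
    for year, (texts, labels) in cases_by_year.items():
        if year <= train_until:
            train_texts.extend(texts)
            train_labels.extend(labels)
    model.fit(train_texts, train_labels)  # trained once, never updated

    # Score the frozen model per later year; a downward trend suggests
    # the court's interpretation has drifted away from the patterns
    # the model learned.
    return {
        year: accuracy_score(labels, model.predict(texts))
        for year, (texts, labels) in cases_by_year.items()
        if year > train_until
    }
```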

She also points out that algorithms can actually perpetuate human error. The US algorithm for estimating recidivism has been shown to rate black defendants as more likely to reoffend than their white counterparts, even when race is not factored into the calculation. One reason for this is that black Americans are disproportionately targeted by the police for certain crimes. The algorithm thus continues the discrimination already present in the system.

A significant translation must also take place between victims’ descriptions of what happened to them and the charge ultimately drawn up by a prosecutor. The same applies with human judges: courts want this kind of information in a very specific form. “Witness statements from people believed to be victims are not very suitable for assessment by these kinds of methods,” says Medvedeva. The information is often incomplete and favors one party in the case (here, the prosecution). Moreover, the algorithm does not ‘weigh’ the interests of the prosecution and the accused the way a real judge would.

Moreover, unlike a judge, a computer cannot explain how it reached a particular decision, while a suspect always has the right to know why he was acquitted or convicted.

Find similar cases

Medvedeva does see a future for algorithms in the judiciary, but more in roles they already play: looking up comparable cases and case law, for instance, or estimating whether a particular case stands a chance of success.

The appointment of a real robot judge (in the Netherlands) does not seem imminent. In the past, algorithms that perform risk assessments based on data from various databases have been used, for example, to identify potential benefit fraudsters. Here too the problem arose that it was not clear how the algorithm worked. A (human) judge in The Hague ruled in 2020 that this system, known as SyRI, could no longer be used. ‘Looking under the hood’ was not possible, the court argued. Moreover, it combined so much data that it invaded people’s privacy excessively, violating the right to respect for private life. A robot judge would run into similar problems.
