"Misled justice": How did the British Post Office's algorithms destroy hundreds of lives?

London – The British public is these days revisiting the details of a scandal whose events began more than two decades ago, in which erroneous data and faulty technical systems misled the judiciary into upholding embezzlement charges against hundreds of postal workers.

British public opinion now awaits what new circumstances an independent judicial report will reveal about the case, after its first findings were announced a few days ago.

The initial findings of the investigation linked 13 suicides among postal staff to false accusations of financial corruption levelled against them, accusations that later turned out to rest on erroneous digital data.

Investigations showed that a fault in a software system caused financial balances to vanish from post office accounts, while the Post Office's management claimed the shortfalls were the result of fraud committed by a number of employees.

The case angered the British public after the report revealed that 13 post office workers had taken their own lives as a result of the prosecutions against them.
Investigations showed that a defect in a software system caused financial balances to disappear from post office accounts (Al Jazeera)

Errors beyond doubt

Between 1999 and 2015, the British judiciary brought charges of theft and false accounting against about 1,000 post office branch managers in Britain, and around 236 of them were sentenced to prison.

For years the Post Office relied on data from the "Horizon" system in bringing these prosecutions, treating the software's output as conclusive evidence against the accused. It insisted its data was accurate and sufficient grounds for criminal cases against its own employees, despite repeated warnings of defects in the system.

For two decades, affected staff and human rights campaigners led a campaign to expose the falsity of the charges. Between 2017 and 2019, 555 of those affected brought a group lawsuit, after which, in late 2020, the courts began quashing the convictions handed down to the accused.

The postal staff "falsely accused of financial fraud" won wide public sympathy in Britain after the ITV channel last year aired a drama depicting their suffering and the hardships they faced once the courts convicted them, on the premise that computer-generated data carried a nonexistent margin of error.

The head of the inquiry, Sir Wyn Williams, said in a speech following the publication of the first report that the affected workers endured "catastrophic" human suffering and severe pressure as a result of the prosecutions, that their lives and their families' futures were destroyed, and that they still face deadly delays in the payment of compensation, which around 10,000 employees are claiming.

Those affected endured human suffering and severe pressure as a result of the prosecutions (Al Jazeera)

Systems prone to error

According to the first part of the report, investigators did not rule out that officials at Fujitsu, the Japanese technology company that designed the "Horizon" system, along with other Post Office staff, knew of the system's flaws before its launch but turned a blind eye to them and insisted on the integrity of the data the software produced.

Fujitsu has been one of the most important software suppliers to the British government for decades, and the Financial Times reports that political pressure over the Post Office scandal prompted the British government to exclude the company from future government projects to automate public services.

But other British media revealed that the government has continued to award contracts to the company, including in cloud computing and artificial intelligence: 23 contracts since the beginning of last year, among them a 63-million-pound extension of an agreement with the Post Office itself.

The case revives the controversy over the dangers of treating the vast data produced by digital systems as neutral and infallible, and of using it as conclusive grounds to condemn institutions and individuals, at a time when Britain relies ever more heavily on such software to modernise public services.

Ameer Al-Nemrat, head of the cybersecurity and artificial intelligence centre at the University of East London, did not rule out that the companies supplying these systems are aware of their flaws but, driven by commercial competition, turn a blind eye to them. He warned that such incidents may recur in the future as dependence on artificial intelligence and algorithms to run government institutions in Britain grows.

Al-Nemrat stresses, in an interview with Al Jazeera Net, that the case is a warning that technical systems can cause fatal errors, and that programs issuing automated decisions that may affect the fate of thousands of individuals must be made transparent, with mechanisms to hold the authorities that develop them accountable.

The Post Office insisted on bringing criminal cases against its employees despite doubts about the validity of the data (Al Jazeera)

A dangerous precedent

Although the Post Office case dates back to the start of the millennium, its repercussions, which reached the point of some of the accused taking their own lives under the pressure they faced, have shed light on similar cases caused by the poor quality of government data and its biases: wrongful denial of social benefits, the classification of people as threats to public security, and the opening of baseless investigations against them.

Last February, Amnesty International called on British police to stop using crime-prediction algorithms that target potential offenders, after it was shown that they rest on unfair data targeting the marginalised and the poor, warning that these technological systems can cause severe harm to those groups.

Lina Dencik, a researcher at the Data Justice Lab at Cardiff University, believes that automated decisions should not be trusted, because they are produced by algorithmic systems that are "black boxes" whose underlying reasoning and calculations are difficult to know. She stresses the need to subject them to human oversight and to scrutinise the results they produce.
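The principle of human oversight described here, routing an automated system's output to human review rather than treating it as conclusive, can be sketched in a few lines. This is a hypothetical illustration only: the record structure, branch names and figures are invented for the example and are not drawn from the Horizon system.

```python
from dataclasses import dataclass

@dataclass
class BranchRecord:
    branch_id: str
    system_balance: int   # pence, as reported by the accounting software
    paper_balance: int    # pence, from an independent record

def review_queue(records):
    """Flag discrepancies for human review instead of treating the
    software's figure as proof of wrongdoing: a mismatch shows only
    that *some* record is wrong, and the software may be the faulty side."""
    flagged = []
    for r in records:
        if r.system_balance != r.paper_balance:
            flagged.append((r.branch_id, r.system_balance - r.paper_balance))
    return flagged

records = [
    BranchRecord("A1", 100_00, 100_00),
    BranchRecord("B2", 75_00, 125_00),   # software shows a shortfall
]
print(review_queue(records))  # → [('B2', -5000)]
```

The point of the sketch is the design choice: the function's output is a queue for human investigators, not a verdict, which is precisely the distinction the Post Office prosecutions collapsed.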

The researcher warns against total reliance on big data and the algorithms of artificial intelligence systems, because even as these systems claim neutrality and accuracy, they are riddled with errors and defects that can lead to fatal outcomes.
