British Grading Debacle Shows Pitfalls of Automating Government

LONDON — Even after a final term with schools closed for the pandemic, Sam Sharpe-Roe was optimistic about the coming school year. Teachers from his West London school had given him grades — three A’s and one B — that were strong enough to secure him a spot at his first choice of university next month.

But after the British government used a computer-generated score to replace the exams that were canceled because of the coronavirus, all of his grades fell and the university revoked his offer.

Mr. Sharpe-Roe, along with thousands of other students and parents, had received a crude lesson in what can go wrong when a government relies on an algorithm to make important decisions affecting the public.

Experts said the grading scandal was a sign of debates to come as Britain and other countries increasingly use technology to automate public services, a shift that proponents argue makes government more efficient and removes human prejudice.

But critics say the opaque systems often amplify biases that already exist in society and are typically adopted without sufficient debate, faults that were put on clear display in the grading disaster.

Nearly 40 percent of students in England saw their grades reduced after the government used the software model to generate scores for the canceled exams, known as A-levels. The model’s calculations included a school’s past performance on the tests and a student’s earlier results on “mock” exams.
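Ofqual published a lengthy technical description of its model, and the sketch below is far simpler than the real thing. It illustrates only the widely reported core of the approach: fitting students, as ranked by their teachers, to their school’s historical grade distribution. The function, the distribution and the student names are hypothetical, invented here purely for illustration.

# Illustrative sketch only; Ofqual's actual model was more complex.
# All inputs here are hypothetical.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]

def assign_grades(ranked_students, historical_distribution):
    """Give each student (ranked best-first by teachers) a grade, so the
    school's results match its historical grade proportions."""
    n = len(ranked_students)
    results = {}
    cumulative = 0.0
    start = 0
    for grade in GRADES:
        cumulative += historical_distribution.get(grade, 0.0)
        end = round(cumulative * n)  # students at this grade or better
        for student in ranked_students[start:end]:
            results[student] = grade
        start = end
    return results

# A school whose past cohorts mostly earned B's and C's:
history = {"A*": 0.05, "A": 0.15, "B": 0.30, "C": 0.30, "D": 0.10, "E": 0.10}
ranking = [f"student_{i}" for i in range(1, 21)]  # 20 students, best first
print(assign_grades(ranking, history))

Even in this simplified form, the core complaint is visible: however strong an individual student’s work, the model caps their grade at what their school’s past cohorts achieved.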

Government officials said the model was meant to make the system more fair, balancing out potentially inflated scores given by some teachers. But students and their parents, particularly those from lower-income areas with struggling schools, were outraged that their futures had been turned over to lines of code that favored students from private schools and wealthy areas.


Even after the government apologized and threw out the computer scores, many students had already lost their slots at their preferred universities, sending the admission process into further chaos.

“These algorithms are obviously not correct,” said Mr. Sharpe-Roe, 18, whose home borough of Ealing is enormously diverse but also divided by race, ethnicity and income. “I know a load of other people who are in a similar situation.”

Students and parents were outraged that the algorithm used to adjust exam scores favored students from private schools and wealthy areas.
Credit: Peter Nicholls/Reuters

The outcome, experts say, was entirely predictable. In fact, the Royal Statistical Society had for months warned the exams regulator, Ofqual, that the model was flawed.

“It’s government trying to emulate Silicon Valley,” said Christiaan van Veen, director of the digital welfare state and human rights project at New York University. “But the public sector is completely different from private companies.”

As an investigator for the United Nations, Mr. van Veen studies how Britain and other countries use computers to automate social services. He said the techniques were being applied to policing and court sentencing, health care, immigration, social welfare and more. “There are no areas of government that are exempt from this trend,” he said.

Britain has been particularly aggressive in adopting new technology in government, often with mixed results. Earlier this month, the government said it would stop using an algorithm to weigh visa applications after a legal complaint argued that it was discriminatory. A few days later, a British court ruled against the police’s use of some facial-recognition software.

The country’s automated welfare system, Universal Credit, has faced years of criticism, including from the United Nations, for making it harder for some citizens to obtain unemployment benefits. Britain’s contact-tracing app, which the government had said would be key to containing the coronavirus, has been delayed by technical problems.

“There is an idea that if it has an algorithm attached to it, it’s novel and interesting and different and innovative, without understanding what those things could be doing,” said Rachel Coldicutt, a technology policy expert in London who is working on a book about responsible innovation.

Those who have called for more scrutiny of the British government’s use of technology said the testing scandal was a turning point in the debate, a vivid and easy-to-understand example of how software can affect lives.

Cori Crider, a lawyer at Foxglove, a London-based law firm that filed a complaint against the grading algorithm, said the problem was not the use of technology itself but a lack of transparency: little is known about how the models work before they are introduced.

“There has been a tendency to compute first and ask questions later,” said Ms. Crider, who also brought the legal challenge against the visa algorithm. “There’s been a refusal to have an actual debate about how these systems work and whether we want them at all.”


For years, Britain has heralded technology as a way to modernize government and provide social services more efficiently. The trend has spanned several administrations but has been given fresh momentum under Prime Minister Boris Johnson.

His top political adviser, Dominic Cummings, has argued forcefully that Silicon Valley thinking is needed to create high-performance government, including hiring workers in areas like data science and artificial intelligence. He has expressed admiration for the “frontiers of the science of prediction.”


In response to the coronavirus, Britain has sought help from companies like Palantir, a Silicon Valley analytics firm that was hired to manage data for the country’s National Health Service. A London-based artificial intelligence firm, Faculty, is working on predictive systems to help track the virus.

The delays to the contact-tracing app stemmed in part from another embarrassing misstep: the government decided to build its own software rather than use technical standards set by Apple and Google, despite warnings that its approach would have limitations.

Britain is not alone in turning some decisions over to computer systems. In the United States, algorithms are used by police departments to determine where officers patrol and by courts to set prison sentences. In Spain, the monitoring group Algorithm Watch identified a system being used to predict households at risk of domestic violence. The Netherlands abandoned the use of a system to detect welfare fraud after a judge said it was unlawful.

The techniques are often pitched as apolitical, but researchers say they disproportionately affect lower-income and minority groups.

“One of the great benefits of these tools for governments is it allows them to portray the decisions they are making as neutral and objective, as opposed to moral decisions,” said Virginia Eubanks, an associate professor at the State University of New York at Albany, whose book, “Automating Inequality,” explores the topic.

In Britain, the political fallout of the grading mishap dominated the news and led to calls for the country’s education minister to resign. Students protested outside Parliament, chanting expletives at “the algorithm.”

Critics say the experience shows the risks ahead as more sophisticated tools like artificial intelligence become available and companies pitch them to public agencies.

Mr. Sharpe-Roe said “there’s a lot of anger” at having his fate set by an algorithm. After struggling to regain his lost spot at college, he decided to defer for a year to work.
