Will the UK’s housebuilding algorithm join the government’s growing AI graveyard?

There are a number of dangers in using algorithms. Owl highlights a couple.

They need to be explained, not just made transparent. Neil O’Brien MP published a formula to calculate an “adjustment factor” based on affordability. But what was the underlying logic? What was it supposed to be doing? Where did it come from? As a general rule, if those who commissioned and created an algorithm cannot explain, in simple language, what it is doing, then they themselves don’t understand it well enough for it to be used.
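To give a sense of the kind of formula at issue: the pre-2020 standard method published in planning guidance uplifted baseline housing need by an affordability adjustment of ((ratio − 4) / 4) × 0.25 + 1. The sketch below uses that earlier formula as an illustration only; the coefficients in the 2020 proposal differ, and the function names are hypothetical, not the government’s.

```python
# Illustrative sketch only: uses the affordability adjustment from the
# pre-2020 standard method, ((ratio - 4) / 4) * 0.25 + 1, as an example.
# The 2020 proposal's exact coefficients differ; names are hypothetical.

def affordability_adjustment(median_affordability_ratio: float) -> float:
    """Uplift applied to baseline housing need.

    A ratio of 4 (median house prices four times median earnings) yields
    no uplift; each point above 4 adds 6.25% to the baseline figure.
    """
    return ((median_affordability_ratio - 4) / 4) * 0.25 + 1

def adjusted_housing_need(baseline_need: float, ratio: float) -> float:
    """Scale the baseline household-growth figure by the adjustment factor."""
    return baseline_need * affordability_adjustment(ratio)

# Example: an area where median house prices are 8x median earnings
print(affordability_adjustment(8.0))     # 1.25 -> a 25% uplift
print(adjusted_housing_need(1000, 8.0))  # 1250 homes
```

Written out this way, the underlying logic is at least inspectable: every constant (the baseline ratio of 4, the 0.25 multiplier) is a policy choice someone should be able to justify.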

There have been a number of papers published in recent years suggesting that algorithms used in public services should undergo a rigorous testing process. Evidently, the recent algorithms we have been hearing about were not thoroughly tested. – Owl
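To illustrate that point, here is a minimal sketch of the kind of sanity checks (runnable with pytest) such a formula could face before deployment. The thresholds are hypothetical assumptions, not drawn from any published guidance.

```python
# Pytest-style sanity checks for the adjustment sketch above. Thresholds
# and test cases are hypothetical assumptions for illustration only.

def affordability_adjustment(ratio: float) -> float:
    # Repeated from the sketch above so this file runs standalone.
    return ((ratio - 4) / 4) * 0.25 + 1

def test_no_uplift_at_baseline_ratio():
    # At the baseline affordability ratio of 4, the factor should be exactly 1.
    assert affordability_adjustment(4.0) == 1.0

def test_factor_rises_with_unaffordability():
    # Less affordable areas should never receive a smaller uplift.
    assert affordability_adjustment(10.0) > affordability_adjustment(5.0)

def test_factor_stays_within_plausible_bounds():
    # Flag runaway outputs: even extreme ratios shouldn't triple the baseline.
    for ratio in (4.0, 8.0, 12.0, 20.0):
        assert 1.0 <= affordability_adjustment(ratio) < 3.0
```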

Thomas Macaulay, thenextweb.com

It’s been a seriously rough few weeks for algorithms in the UK.

The problems started on August 7, when the British government scrapped an algorithm used in visa applications, following allegations that it was creating “speedy boarding for white people.”

Weeks later, England joined Scotland, Wales, and Northern Ireland in ditching a model used to calculate school exam results after evidence emerged that it had penalized poorer students.

The algorithms must have thought their month couldn’t get any worse. But in the last two days, they’ve been hit with another double dose of bad news.

Yesterday, the Guardian revealed that around 20 councils — local government authorities in the UK — have stopped using an algorithm to detect fraudulent welfare claims.

Researchers from the Cardiff Data Justice Lab (CDJL) found that one algorithm was dumped after falsely flagging low-risk claims as high-risk, while another was dropped because it simply didn’t make a difference to the council’s work.

The CDJL also discovered that Sunderland council had scrapped a separate algorithm designed to make efficiency savings, while Hackney had ditched one that identified children at risk of abuse.

“Algorithmic and predictive decision systems are leading to a wide range of harms globally, and… a number of government bodies across different countries are pausing or canceling their use of these kinds of systems,” Dr Joanna Redden from the Data Justice Lab told the Guardian.

She might not have long to wait to add another to the list.

Meet the planning algorithm

The British government recently introduced a new formula for calculating where new housing is built. But planning consultancy Lichfields today claimed the algorithm would lead to more homes being constructed in the countryside and suburbs — typically Tory-voting areas — and fewer in towns and city centers.

The plans have achieved the rare feat of attracting critics from across the political spectrum.

Conservative MP Neil O’Brien warned Tory voters wouldn’t want more housing where they live; Labour’s Kate Hollern accused the government of “leveling-down areas;” and the Green Party’s Natalie Bennett said the plans would “step up regional inequality even further, and hack into the greenbelt for the benefit of mass housebuilders.”

Whether the housebuilding algorithm joins the ones used for exam results and welfare claims on the shelf remains to be seen. But it will certainly come under increased scrutiny over the months to come.