Posted: 2021-11-30 05:54:32

The best-known AI failure in Australia was Centrelink's automated income compliance regime, known as robo-debt, which recovered alleged overpayments from welfare recipients. In fact, the computer program that calculated people's income and welfare entitlements was flawed and unlawful: it demanded that vulnerable people repay money which had rightfully been paid to them. Centrelink had to pay $1.2 billion to 400,000 people to settle a class action.


The Ombudsman said the general lesson of robo-debt was that governments have to be much more careful when using AI because it can affect so many people and it can be so hard to correct using traditional mechanisms of redress such as complaining to the Ombudsman or an administrative tribunal. Often the only way to resolve problems is an expensive overhaul of software.

Getting it right is all the more important because, as the Ombudsman says, “machine technology is disproportionately used in areas that affect the most vulnerable in society – in areas such as policing, healthcare, welfare eligibility, risk scoring and fraud detection”.

It is, of course, not just in government that algorithms are getting a bad name. Social media companies such as Facebook and Google have been criticised for using machine-learning techniques to curate the information sent down people's feeds in ways that distort democracy. Banks use similar techniques when issuing loans.

But governments have to uphold the highest standards of fairness and legality. As the Ombudsman argues, as a start, people should be told when a decision is being made by a machine rather than a person. The government has to use artificial intelligence with extreme care and greater transparency.
