MADISON, Wis. — Researchers at a Wisconsin university have been awarded a $1 million grant to develop a tool to find and fix algorithmic bias.

University of Wisconsin-Madison researchers will use the National Science Foundation grant to work on a tool called FairSquare, which will detect and automatically fix biases in software and algorithms, the Wisconsin State Journal reported.

Computer software is increasingly being used by private companies and government offices to make decisions on hiring, bank loans and prison sentences.

While computers are designed to be logical, growing evidence shows that programming can cause them to deliver decisions that are prejudiced and incorrect, said Aws Albarghouthi, a computer science professor at the university. A big problem scientists face is how to define fairness or unfairness.
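The definitional problem Albarghouthi describes can be made concrete with one common, and contested, formalization called demographic parity, which asks whether a program's favorable decisions land on different groups at similar rates. The sketch below is purely illustrative and is not how FairSquare itself works; the function name and data are hypothetical.

```python
def demographic_parity_gap(decisions, groups):
    """Absolute difference in favorable-decision rates between
    two groups, labeled 0 and 1. A gap near 0 satisfies
    demographic parity; a large gap suggests disparate impact."""
    rate = {}
    for g in (0, 1):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        rate[g] = sum(outcomes) / len(outcomes)
    return abs(rate[0] - rate[1])

# Hypothetical hiring model: approves 4 of 5 applicants in group 0
# but only 1 of 5 in group 1 -- a gap of 0.8 - 0.2 = 0.6.
decisions = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
groups    = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
print(demographic_parity_gap(decisions, groups))
```

Even this simple metric is debatable: equalizing approval rates can conflict with other plausible definitions, such as equal error rates across groups, which is exactly the ambiguity the researchers cite.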

“As we move deeper into the 21st century, the question of correctness becomes fuzzier and fuzzier because computer programs are doing sensitive tasks whose correctness is not well-defined, but is a debatable ethical, philosophical or moral question,” Albarghouthi said.

The grant is expected to help the team take their existing prototype and create a finished product.

The state Department of Corrections uses an algorithm called COMPAS to compare the records of convicts with similar characteristics and histories and evaluate the risk of convicts becoming repeat offenders. The assessments are used in pre-sentencing reports to judges, to classify inmates and to plan their releases.

Last year Eric Loomis went to court over the program, saying he was denied due process rights. He also said he was sentenced unfairly because the company that created COMPAS doesn’t reveal how the algorithm works, making it unclear how different factors are weighed.

The state Supreme Court ruled against Loomis because the assessment wasn’t the sole factor in his sentencing, but ordered the corrections department to add a disclaimer to pre-sentencing reports.

Information from: Wisconsin State Journal