Automated systems that screen welfare claimants for signs of potential fraud or negligence are making that judgment based in part on the applicant's age, prompting calls for a review of whether the system is lawful.
Xantura, a British tech company that has provided "risk-based verification" to about 80 councils and assessed hundreds of thousands of claimants, previously said it did not feed the algorithm any information protected by anti-discrimination law.
However, its chief executive, Wajid Shafiq, now acknowledges that the claimant's age is used. Age is a protected characteristic under the Equality Act 2010, meaning that treating someone less favourably because of it amounts to direct discrimination and can be unlawful.
Xantura spoke to the Guardian after Big Brother Watch, a civil liberties campaign group, obtained a cache of documents under the Freedom of Information Act that offers a glimpse of how Xantura's system works.
The system automatically flags housing benefit and council tax support claimants deemed higher risk for stricter checks, which can delay decisions, while fast-tracking applications judged to be low risk.
Xantura argues that using age helps reduce fraud and error and speeds up the majority of applications, and says it does not breach equality law, citing a legal exception that allows financial service providers to take age into account. Campaigners, however, are calling for closer scrutiny.
Documents released to Big Brother Watch show that Xantura has processed data about where people live, including the ethnic composition of their neighbourhoods.
Gender and race are also protected characteristics, but Shafiq said: "Other than age, we do not use any other protected characteristics." He said information about neighbourhood and gender was used only to check, after decisions were made, whether the system was operating in a biased way.
He declined to reveal what other personal information was fed into the algorithm, saying that doing so could allow claimants to game the system, but said information provided by claimants could be used to prevent fraud and error.
Asked whether the algorithm predicts that older or younger people are more likely to commit fraud or error, he replied: "[It is] not that simple, because this is a multivariate model; different combinations of risk factors have to be present to generate fraud and error claims."
He had previously stated that "the RBV model does not use protected characteristics".
Xantura is one of several companies helping to automate welfare systems, but the workings of these "welfare robots" are shrouded in secrecy. Claimants are not told that their applications are subject to algorithmic decision-making, raising concerns about its impact.
According to a document released to Big Brother Watch, a confidential 2012 "model overview" from Xantura said the variables "found to be statistically significant" included broad categories of area types defined in part by ethnic composition. At the time, these groupings, defined by the Office for National Statistics, included an "ethnicity central" category representing places where non-white people, "especially of mixed or black ethnicity", appeared more commonly than the British average.
A 2017 RBV user guide, issued to Salford city council by Xantura's business partner Northgate, lists 66 "data items required by Xantura" to calculate risk scores, including gender, age and disability.
Shafiq said "the document was created in error" and that not all of these factors were used to determine the risk posed by a claimant.
"There is a difference between the RBV system and the RBV model, and these documents should have made that distinction clearer. There was a misunderstanding."
Some of Xantura's local authority clients have officially stated that the system does not "impact equality" because it does not use protected characteristics such as age, gender, race or disability. Xantura provides client councils with draft templates for implementing RBV policies, including performance reports and sign-offs.
"In our experience, clients use our draft policies to develop their own," said Shafiq.
"We have a duty to prevent fraud and error," he said. "If a local authority decides that age should not be used in the modelling process, it can be removed."
Jake Hurfurt, head of research and investigations at Big Brother Watch, called for greater scrutiny of the system.
Benefits claimant Andy Mitchell, who helps others with their applications, said: "It is the usual targeted groups who are attacked again by these algorithms. They are the poorest and most voiceless people in society."
Robin Allen QC, a discrimination barrister who runs the AI Law Consultancy, said: "Age is not a fair proxy and should not normally be used in that way."
Shafiq defended the system, saying: "It's perfectly appropriate to use age, and it's perfectly appropriate that other information provided by [a claimant] can be used to prevent fraud and errors."
Northgate is part of the Japanese tech giant NEC, and Xantura's product integrates with its revenues and benefits system. NEC Software Solutions said: "We are not involved in the definition of these criteria at all."