A computer system intended to help Thurrock Council predict if a child is at risk of abuse could soon be used to predict if a resident will become homeless or take part in anti-social behaviour.
Thurrock has been using council data and analytics technology to develop what it calls a “predictive modelling platform”, which has helped to identify children who could be at risk so that social services can intervene.
The council began using the system, developed by a private company called Xantura, four years ago to cut costs and allow services to identify cases early, before they become more complex.
The latest report on the system, published in July, states that analytics are responsible for identifying 100 percent of families referred to the council’s troubled families programme, and that the system is also being used to send alerts to safeguarding teams with an 80 percent success rate.
It is estimated that by 2020, the council will have spent £1.14 million on the system.
The council hopes to expand the system further to make predictions on anti-social behaviour, homelessness and debt collection. Predictions will be based on a range of data, including school attendance records, domestic abuse history, youth offending records and economic indicators.
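The report does not explain how such a platform actually combines these indicators. As a minimal sketch only, the example below assumes a simple logistic-regression scorer over hypothetical household indicators (absence rates, prior abuse reports, offending contacts, benefits claims) with an arbitrary alert threshold; none of the feature names, data or thresholds reflect Xantura’s actual model.

```python
# Purely illustrative sketch of a risk-flagging model of the kind described above.
# This is NOT Xantura's system: the features, data and threshold are hypothetical,
# invented only to show how such indicators might be combined into an alert.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical household-level indicators (one row per household):
# [school absence rate, prior domestic abuse reports, youth offending contacts, benefits claims]
X_train = np.array([
    [0.05, 0, 0, 1],
    [0.30, 2, 1, 3],
    [0.10, 0, 0, 0],
    [0.45, 1, 2, 4],
])
y_train = np.array([0, 1, 0, 1])  # 1 = household was later referred for support

model = LogisticRegression().fit(X_train, y_train)

# Score a new household and raise an alert only above a review threshold,
# mirroring the council's point that alerts feed into human verification.
new_household = np.array([[0.25, 1, 0, 2]])
risk = model.predict_proba(new_household)[0, 1]
if risk > 0.6:  # threshold chosen arbitrarily for illustration
    print(f"Flag for safeguarding review (risk score {risk:.2f})")
```

In a sketch like this, the model only produces a score; as the council stresses below, any alert would still be checked by staff before intervention.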
This form of profiling inevitably raises a range of ethical and privacy concerns, but the council did not comment directly on its use of the technology.
Wajid Shafiq, chief executive of Xantura, said the system is not about categorising people or accusing them of future crimes.
“The alerts won’t be generated unless there is a risk these people are already involved in something like anti-social behaviour,” he said.
“To assess that we may use data such as an increase in truancy and whether there are already parenting problems, but it is very pared back what we can actually do. If someone is already engaged with the youth offending teams, then a youth offending officer will see this as an additional report.”
Mr Shafiq also stressed that personal data is not being shared and that Xantura does not make recommendations on what action the council should take once it receives the assessment.
However, the civil liberties organisation Big Brother Watch branded the system a “terrible idea”.
A spokesperson said: “We are alarmed to learn that councils are using predictive systems that invade families’ private lives and make stigmatising assumptions about them. It is astounding that councils have managed to advance such a terrible idea so far.
“This seriously risks profiling families and casting suspicion over their parenting abilities on the basis of high-tech stereotyping.”
Councillor Sue Little, portfolio holder for Children and Adult Social Care, said: “It is important to emphasise that data analytics systems are only part of the process and further verification and checks are carried out prior to any intervention.
“At a time when demand for children’s social care nationally is rising, the use of data analytics provides us with an opportunity to better identify those most in need of help and support and enables us to reduce the need for statutory interventions, which can be distressing for families.
“We are satisfied that our profiling systems are compliant with data protection legislation and best practice.”