Research project highlights dangers of undeclared, intrusive use of data by local authorities
What’s been said?
Professors Ros Edwards (University of Southampton) and Val Gillies (University of Westminster) are co-directing a very topical research project. Faced with a title like “Parental Social Licence for Data Linkage for Service Intervention,” busy home educators might be excused for thinking they’d give this one a miss, but the content is highly significant, timely and relevant to all parents.
The team’s 1 March blog post is headed “Why parents need to understand how their family’s data is being used,” and this short video animation provides an easy introduction to the researchers’ concerns.
Prof Ros Edwards explains:
“We believe that policy developments and data linkage and analytics practices to inform services interventions are moving ahead of public knowledge and consent. We should all take time to understand this better and consider what it means. We hope this video will help prompt discussions among parents… to make sure we are better informed and better equipped to challenge these processes if we think they need challenging.” [Emphasis added]
Another post from June last year details findings from a pilot study which the team undertook to explore parents’ views about linking information for family interventions. Worryingly, only half of the three hundred and sixty-five respondents had heard of data linkage and how it worked. Fewer than half thought it acceptable to use it to improve the planning and delivery of family support services, and far fewer thought it should be used to identify specific families who “might need intervention but hadn’t asked for support or to save public money by preventing or catching family problems early.”
Other comments from participants included concerns about lack of trust, families’ right to privacy, Government surveillance without families’ knowledge or consent, increasing stigma, and problems with data accuracy and safety, among other factors.
Do these themes sound at all familiar in the home educating context, where data collection and usage are hotly contested? There remains huge pressure for the establishment of mandatory Local Authority CNiS registers in England, whilst the Welsh government hopes to work towards databases of all children in Wales, using data sourced from local health boards.
Why does it matter?
Edwards’ warning that policy and practice are “moving ahead of public knowledge and consent” is an important one. If people can’t keep up with developments, or if practices are adjusted subtly without properly notifying those likely to be affected by the changes, what does that mean for consent? This pertains to all parents: in the video clip, Mr and Mrs Taylor’s children were in school.
It is then even more disturbing to look at the way software solutions provider Sentinel Partners promote their services to Local Authorities. Here are some of the grounds on which they market their scarily-titled “Single View of Citizen/Child” solution:
“Fragmented data holds back local authorities and their partner agencies as they seek to deliver impactful mainstream and targeted support services.
Our Single View Solutions overcome this by providing the most complete, accurate and up-to-date information about the individuals they serve.
Data from multiple sources, including external agencies, is integrated to form a single holistic view, enabling organisations to work together to direct resources where they are needed. Target cohorts are then identified using powerful profiling functionality based on criteria set and managed by local teams.
Applications include the identification and care of vulnerable adults, integrated care and an overview of service provision. The Single View of Child solution is optimised for a multi-agency approach where the focus is on the well-being and safeguarding of children….” [Emphasis added]
Couched, of course, only in terms of providing better services to those being served, the marketing is clearly designed to outweigh any concerns about the potential dangers of such a “powerful profiling” tool, or about a decreased reliance on the good, old-fashioned human common sense which used to resolve many a potential problem on an individual basis.
Chillingly, we learn that this “dynamic data software” already forms an integral part of many local authorities’ policy for delivering appropriate and effective “support.”
Scroll down the page for more features, then put yourself in the shoes of an overworked LA staff member with a heavy caseload, terrified of missing any indication of potential risk in today’s risk-averse climate.
Similarly, a ‘sponsored’ article published in Children & Young People Now (21 February) is actually an advertisement for Liquidlogic’s data processing product, which is already being used by a number of LAs. Interestingly, the article also supports Amanda Spielman’s call for “a ‘proper register’ of children to be maintained for children who aren’t attending school.” The author, a company employee and one-time social worker, later states:
“Liquidlogic has developed a joined-up solution across education, early help and social care which can automatically feed school information, including enrolment, attendance data and exclusions and suspensions on a daily basis to the practitioners that need to access it, as well as informing processes such as the Education, Health and Care Plan and for Looked After Children.”
Yes, we can understand the appeal of this technology, but it is frightening to witness a generation of public sector workers being trained to depend more upon data collated by algorithms than on their own intelligence.
As most parents know only too well, life is not binary – it is nuanced. Skill and careful consideration are therefore required to navigate one’s way wisely through its challenges. Many decisions cannot be determined through a binary approach, for both relationships and circumstances have inbuilt self-righting mechanisms as well as disaster-prone tendencies, and can sometimes include ‘near misses’ which never come to anything.
Allocating arbitrary ‘red flags’ to families at the prompting of an algorithm is a far less precise way of identifying genuine risk, and has great potential to generate false positives. Besides being incredibly stressful for the individuals involved, false positives absorb precious staff time and resources, diverting them away from more vital and necessary tasks.
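The arithmetic behind this concern is worth spelling out. The sketch below is purely hypothetical (the figures and the `flag_outcomes` helper are our own illustrative assumptions, not the measured performance of any real system): when genuine risk is rare, even a flagging tool that looks impressively accurate on paper will produce mostly false positives.

```python
# Hypothetical illustration only: how a rare-risk population turns an
# "accurate" screening algorithm into a false-positive generator.

def flag_outcomes(population, base_rate, sensitivity, false_positive_rate):
    """Return (true_positives, false_positives) for a screening tool.

    base_rate           -- proportion of families genuinely at risk
    sensitivity         -- proportion of at-risk families correctly flagged
    false_positive_rate -- proportion of other families wrongly flagged
    """
    at_risk = population * base_rate
    not_at_risk = population - at_risk
    true_positives = at_risk * sensitivity
    false_positives = not_at_risk * false_positive_rate
    return true_positives, false_positives

# Assumed numbers, chosen only for illustration: 100,000 families,
# 1% genuinely at risk, a tool that catches 90% of real cases but
# also wrongly flags 5% of everyone else.
tp, fp = flag_outcomes(100_000, 0.01, 0.90, 0.05)
print(f"true positives:  {tp:.0f}")   # 900
print(f"false positives: {fp:.0f}")   # 4950
print(f"share of flags that are wrong: {fp / (tp + fp):.0%}")  # 85%
```

On these assumed figures, roughly five out of every six flagged families would not in fact be at risk; each of those flags still has to be investigated by the overworked caseworker described above.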
Defend Digital Me’s report ‘The State of Data 2020’ highlights the need for “a rights’ respecting environment in the digital landscape of state education in England.” It looks as though there is a similar need for families outside state education too.
What can I do?
Watch the video and read some of the blog posts from the Southampton research project.
Look around both Sentinel’s and Liquidlogic’s websites, and be aware that these are increasingly indicative of the prevailing environment in which local authority staff work. Recognise also that the processing of data is now a very profitable business, whilst data itself is often described as “the new gold.”
Keep reminding yourself that behind the anonymity of jargon like ‘service users’ and ‘target cohorts’ lie individuals, real human beings living real life in all its glorious technicolour. Think through the dangers of using algorithms to inform decisions about families.
Remember too that knowledge is power, so do share these links with others. The Southampton team want to ensure that parents are “…better equipped to challenge these processes if we think they need challenging.”
Given that home educating parents can find their families in the spotlight purely because they have chosen not to opt in to mainstream schooling, this is very applicable. A trip to A&E for a simple childhood mishap can contribute to a red flag if a health professional deems home education to be another red-flag-worthy factor.
If you think you are being inappropriately ‘profiled’ or ‘red flagged,’ take advice, speak up and push back.
And above all, remember that your personal data is yours. The more that public bodies and agencies gain access to personal information, the more individuals lose control over data which has hitherto been regarded as theirs. We are looking at huge threats to privacy and liberty in family life, and when that data is fed to artificial intelligence, the prospect is extremely worrying. Give us the authentic article any day.