Exploring the Complexities of Children’s Social Care Data
Monday, November 25, 2019
Analysis of aggregate administrative data
By Emily Buehler, Research Officer, Rees Centre.
This post is part of a series published by the Rees Centre on data (see below for related posts).
Secondary analysis of administrative datasets can provide key insight into children’s social care services’ practices and outcomes. The Department for Education collates data from a range of government sources that is made publicly available through the Local Authority Interactive Tool (LAIT). Over 500 data items at the local authority level are available and span a range of topics related to the safety and well-being of children and young people.
Indicators included in the LAIT tool can be used by researchers, analysts and practitioners to explore relationships and trends that can then inform decision-making, resource allocation, policies, and social work practice. As with any research utilising secondary data, it is important to consider how outcomes of interest are defined and measured through the use of specific items. Integral to the process is the identification of an outcome, a decision on which data items can be used to measure that outcome, and a justification for why the outcome is important in the given context (La Valle et al., 2019). In the case of children’s social care administrative data, however, the analysis, findings and interpretation of findings are not always so straightforward. Constructing a narrative of change or progress through data is vulnerable to bias in which indicators are chosen, how they are analysed, and how results are interpreted. It is therefore paramount to acknowledge these complexities when using the data to inform assessments about the qualities of local authorities.
The following illustration highlights how preliminary insights from analysis should be reviewed carefully and critically with regard to how they explain an outcome.
Perhaps I am interested in drawing conclusions about the stability of the social worker workforce in the Outer London region. This outcome is important because workforce stability has implications for continuity of care, service effectiveness, and economic costs (La Valle et al., 2019; Hussein et al., 2015). The Children’s Commissioner’s Office produces an index of social worker stability; its research has found that labour market indicators such as rates of turnover, vacancies and agency staff are the strongest predictors of a looked after child experiencing multiple social worker changes (Clarke, 2019). In this example, I will explore turnover rate, which is calculated by dividing the number of leavers by the number of social workers in place at year end (Department for Education, 2019). The following plot shows the turnover rates for each Outer London authority at two time points: 2014 and 2018.
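The turnover calculation described above is simple enough to sketch in a few lines of Python. The counts below are invented for illustration; they are not real LAIT or Department for Education figures:

```python
def turnover_rate(leavers, in_place_at_year_end):
    """Turnover rate as defined in the post: the number of leavers
    divided by the number of social workers in place at year end."""
    return leavers / in_place_at_year_end

# Hypothetical counts for one local authority at two time points.
rate_2014 = turnover_rate(leavers=45, in_place_at_year_end=150)  # 0.30
rate_2018 = turnover_rate(leavers=18, in_place_at_year_end=160)

# Change between the two years, in percentage points.
change = (rate_2018 - rate_2014) * 100
print(f"2014: {rate_2014:.1%}  2018: {rate_2018:.1%}  change: {change:+.1f} pp")
```

A negative change, as here, would place the authority towards the bottom of the plot described below.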
Local authorities at the top of this plot have had the most dramatic increases in social work staff turnover over the past four years, and those at the bottom have had the largest decreases. Shorter lines between dots represent smaller changes. In relating these findings back to the outcome of interest, workforce stability, it is worth interrogating them further, starting with the extremes. In Bromley, the rate of turnover has decreased by nearly 20%; in Havering, it has increased by approximately the same amount. In explaining these changes and their potential impacts, it is useful to consider the journeys of both authorities over this period of time.
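The ordering in the plot — largest increases at the top, largest decreases at the bottom — amounts to sorting authorities by the change in their rates between the two years. A minimal sketch, using invented rates rather than actual published figures:

```python
# Hypothetical (2014, 2018) turnover rates in percent; not real data.
rates = {
    "Bromley":  (33.0, 13.5),
    "Havering": (12.0, 31.5),
    "Barnet":   (14.0, 30.0),
}

# Change in percentage points, sorted descending so the largest
# increases come first, mirroring the top-to-bottom plot order.
changes = sorted(
    ((name, y2018 - y2014) for name, (y2014, y2018) in rates.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, delta in changes:
    print(f"{name:10s} {delta:+.1f} pp")
```

With these invented numbers, Havering would sit at the top of the plot and Bromley at the bottom.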
Bromley’s children’s care services were judged ‘Inadequate’ overall when inspected by Ofsted in June 2016 and made a substantial improvement by their most recent inspection, earning an overall ‘Good’ rating in January 2019. Similarly, Havering’s children’s social care services improved from a rating of ‘Requires Improvement to be Good’ in December 2016 to ‘Good’ as of July 2018. In thinking about the improvement journeys of both local authorities, there are several possible explanations for the large discrepancies in turnover rates and what they may indicate about the impact of workforce stability. High staff turnover in 2018, around the time of an improved rating, could be associated with an organisational restructure in which ineffective staff left the local authority. Low turnover at a time of an improved rating could mean that stability in the workforce has led to more effective service delivery. Both explanations could be equally plausible.
These explanations may be further probed by examining the changes in Ofsted ratings for other local authorities in the region that have experienced large changes in rates of staff turnover in either direction. Waltham Forest, Hounslow, Brent, and Hillingdon are four other local authorities whose turnover rates were over 25% in 2014 and have since decreased by around 10%. All four were judged as requiring improvement during inspections in 2014–2015 but received an overall rating of ‘Good’ at their most recent inspections in 2018–2019. Barnet has also experienced a shake-up of its staff, with turnover increasing by over 15% over this period. Like Havering, this coincides with an improvement in its overall Ofsted rating (moving from ‘Inadequate’ in 2017 to ‘Good’ in 2019). These patterns may suggest that both large changes in social work staff and a transition to a more consistent workforce can have beneficial effects on a local authority.
These initial findings can be supplemented by deeper exploration of annual trends in turnover, or by comparison with other indicators of workforce stability, such as rates of vacancies or staff absences over the year. Turnover alone may not be an accurate measure of stability, and stability itself may not always equate to the most effective services and good practices. It is also important to consider external factors that may influence the movement of social workers, such as neighbouring local authorities offering financial incentives or promises of lighter workloads (ADCS Safeguarding Pressures Phase 6 Main Report, 2018). The social worker workforce does not operate in a vacuum, and the complexities of a constantly changing social care system influence these measures of stability. Changes in policy, budgets and grant funding, and relationships with Ofsted under the new ILACS inspection framework place added pressures on local authorities. Given these limitations, analysis of aggregate administrative data should be acknowledged as an important first step on a research journey, but it should not be the last.
Related network: Children’s Social Care Data User Group
The Children’s Social Care Data User Group (CSC DUG) was set up in 2017 by the Rees Centre, University of Oxford and Thomas Coram Research Unit, UCL. It is a network of academics, local authority data managers, analysts, charities and funders with a shared vision that administrative data from children’s social care and other relevant agencies in England can be analysed and fed back into policy and practice to improve the way that children’s social care services respond to children, young people and their families.
The group focuses on data submitted to and analysed by the Department for Education (namely the SSDA 903 children looked after data and the Children in Need Census data).
Membership is open to any individual or organisation that is using, or plans to use, children’s social care data collated by the Department for Education (Child in Need data, Looked After Children data or Section 251 social care expenditure data in relation to children’s services).
To join the group’s mailing list: email email@example.com