
By Jean Mallo, Performance Manager, Children’s Services Performance Team, Wandsworth Borough Council and co-chair of LIEG (London Information Exchange Group).

This post is part of a series published by the Rees Centre on data (see below for related posts).

The five structural and formatting barriers to turning numbers into intelligence, or Data to Insight

Local authority resources are dwindling, and there is increased demand on children’s services to make better and quicker decisions so that the right action is taken promptly to improve the lives of children and families. As a result, there is renewed pressure on performance teams to turn numbers into intelligence, or Data to Insight.

Each manual step taken to manipulate data into a usable structure and format before analysis can introduce human error. More critically, it leads to duplicated effort across the 151 local authorities in England. To put this into perspective, if a table of published data requires just five minutes of manual preparation, that is over 12.5 hours across England (151 authorities × 5 minutes), or almost two working days of an analyst’s time, spent on formatting rather than on understanding or learning.

The five most common types of structural and formatting problems taking up analysts’ time are:

Hidden or blank rows and columns

What you see is not what you get. Copying what appears to be a simple five-by-five table can produce a messy muddle that requires manually deleting rows and/or columns and unmerging cells before it can be used in reports or as underlying data for analysis.
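As an illustration, here is a minimal pandas sketch of this clean-up; the file name published_table.xlsx is hypothetical, and the layout assumptions will not match every publication:

```python
import pandas as pd

# Hypothetical published spreadsheet; merged cells arrive as blanks
# when the file is read programmatically.
df = pd.read_excel("published_table.xlsx")

# Drop rows and columns that are entirely empty, which is how hidden
# padding rows and columns usually surface.
df = df.dropna(how="all").dropna(axis=1, how="all")

# Merged label cells leave gaps beneath the original value;
# forward-filling the first column restores the repeated labels.
df.iloc[:, 0] = df.iloc[:, 0].ffill()
```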

Numbers and dates stored as text or strings

For a number or a date to become insight, it needs to be understood in a wider context. For example: is the number high or low, is there an upward trend or a decline, what is the duration between two dates? Numbers and dates stored as text or strings need to be converted before these questions can be answered.
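A minimal pandas sketch of these conversions, with hypothetical column names and values:

```python
import pandas as pd

# Hypothetical data: numbers and UK-style dates arriving as text.
df = pd.DataFrame({
    "rate": ["12.5", "9.1", "n/a"],
    "period_start": ["31/03/2018", "31/03/2019", "31/03/2020"],
})

# Coerce text to numbers; anything unparseable becomes NaN instead of raising.
df["rate"] = pd.to_numeric(df["rate"], errors="coerce")

# Parse day-first dates so trends and durations can be computed.
df["period_start"] = pd.to_datetime(df["period_start"], dayfirst=True)

# Questions like 'what is the duration between two dates?' become one line.
span = df["period_start"].max() - df["period_start"].min()
```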

Dealing with missing data

Turning what appears to be ‘no data’ into genuine ‘no data’ can be a time-consuming task: a cell that looks empty may actually contain a space, a letter or a symbol. The rules change from one source to another, sometimes even within the same organisation or publication.
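A sketch of standardising these markers on import with pandas; the file name indicators.csv and the marker list are illustrative assumptions, since, as noted, the rules vary by source:

```python
import pandas as pd

# Markers that variously stand in for 'no data' across publications
# (illustrative list; real publications differ).
NA_MARKERS = ["", " ", "x", "c", ".", "-", ".."]

df = pd.read_csv("indicators.csv", na_values=NA_MARKERS)

# Cells containing only whitespace can still slip through; treat them
# as genuinely missing too.
df = df.replace(r"^\s*$", pd.NA, regex=True)
```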

Machine readable is not always human readable

On the one hand, the ‘tidy data’ format is an analyst’s dream for converting numbers into tables and visualisations using the latest software and technology. On the other, the standard table format is easy to read and understand, not only by local authority analysts but also by the public.
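Converting between the two shapes is mechanical, so the tension is about publication defaults rather than lost information. A minimal pandas sketch, using hypothetical authorities and illustrative values:

```python
import pandas as pd

# Human-readable 'standard' table: one row per authority, one column per year.
wide = pd.DataFrame({
    "authority": ["Authority A", "Authority B"],
    "2017": [30.1, 10.2],   # illustrative values only
    "2018": [11.5, 29.8],
})

# Tidy (long) form: one row per authority-year observation, the shape
# most analysis and visualisation tools expect.
tidy = wide.melt(id_vars="authority", var_name="year", value_name="value")

# And back again for human readers.
wide_again = tidy.pivot(index="authority", columns="year", values="value")
```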

Inconsistencies between publications

Regardless of the format or structure an organisation chooses, what carries the most weight is consistency between publications. The children’s social care sector has developed its own data warehouses and analysis tools, and each time structures and formats change, local authorities carry the further burden of rewriting code to manipulate the data back into the required structure.

The first three barriers have been solved in other fields, and their solutions are starting to attract the attention of children’s services. There are common standards for how data should be defined, organised and described, and accessing information through Application Programming Interfaces (APIs) is proving more consistent than spreadsheets. We know that the Department for Education (DfE) is looking at both solutions, although children’s social care data appears to be behind education data in the queue for improvements. Whilst we wait for these to be implemented, there could be merit in developing and sharing scripts that normalise data, automating the clean-up of issues such as inconsistent approaches to dates and missing data; a sketch of such a script follows.
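A sketch of what such a shared normalisation script might look like, chaining the fixes above into one reusable function; the marker list and column arguments are assumptions rather than an agreed sector standard:

```python
import pandas as pd

NA_MARKERS = ["", " ", "x", "c", ".", "-", ".."]  # illustrative only

def normalise(df, date_cols=(), numeric_cols=()):
    """Apply the common clean-ups once, before any analysis."""
    # Standardise the various 'no data' markers to a genuine missing value.
    df = df.replace(NA_MARKERS, pd.NA)
    # Convert text-stored dates and numbers into usable types.
    for col in date_cols:
        df[col] = pd.to_datetime(df[col], dayfirst=True, errors="coerce")
    for col in numeric_cols:
        df[col] = pd.to_numeric(df[col], errors="coerce")
    # Drop fully empty rows/columns left over from spreadsheet layout.
    return df.dropna(how="all").dropna(axis=1, how="all")
```

Each local authority could then call a function like this once per publication, instead of 151 teams rewriting the same clean-up.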

By contrast, inconsistency between publications feels like a challenge to be met with intelligent and inclusive partnership working rather than technology. Increasingly, organisations that provide data or define data standards, such as the DfE and Ofsted, are starting to recognise that seemingly minor decisions about how to label and organise data can have very significant impacts on local government. We have seen an openness to involving children’s services in those decisions, so that we can show them how best to help us, and to working collaboratively. Formalising and expanding those arrangements could save substantial amounts of time for local government staff, and enable a new generation of analytical tools to be built upon more predictable and comparable data.

Improvements in technology and partnership working continue to drive the move towards centralised data manipulation. The benefit is that children’s services can focus on providing valuable and insightful local analysis that makes a real difference to decisions in their area.

Jean Mallo, Performance Manager, Children’s Services Performance Team, Wandsworth Borough Council, and co-chair of LIEG (London Information Exchange Group)

This post is part of a series published by the Rees Centre on data. The Rees Centre welcomes guest blog posts from professionals across the sector. Views expressed are the authors’ own and do not represent those of the Rees Centre.

Related posts:

Using data tools in local authority children’s services

Exploring the complexities of children’s social care data

Related network:

Children’s Social Care Data User Group

The Children’s Social Care Data User Group (CSC DUG) was set up in 2017 by the Rees Centre, University of Oxford and Thomas Coram Research Unit, UCL. It is a network of academics, local authority data managers, analysts, charities and funders with a shared vision that administrative data from children’s social care and other relevant agencies in England can be analysed and fed back into policy and practice to improve the way that children’s social care services respond to children, young people and their families.

The group focuses on data submitted to and analysed by the Department for Education (namely the SSDA 903 children looked after data and the Children in Need Census data).

Membership is open to any individual or organisation who is using, or plans to use, children’s social care data collated by the Department for Education (Children in Need data, Looked After Children data or Section 251 social care expenditure data in relation to children’s services).

To join the group’s mailing list: email rees.centre@education.ox.ac.uk

Analysis of aggregate administrative data

By Emily Buehler, Rees Centre.

This post is part of a series published by the Rees Centre on data (see below for related posts).

Secondary analysis of administrative datasets can provide key insight into children’s social care services’ practices and outcomes. The Department for Education collates data from a range of government sources and makes it publicly available through the Local Authority Interactive Tool (LAIT). Over 500 data items at local authority level are available, spanning a range of topics related to the safety and well-being of children and young people.

Indicators included in the LAIT tool can be used by researchers, analysts and practitioners to explore relationships and trends that can then inform decision-making, resource allocation, policies and social work practice. As with any research utilising secondary data, it is important to consider how outcomes of interest are defined and measured through the use of specific items. Integral to the process is the identification of an outcome, a decision on which data items can be used to measure that outcome, and justification for why that outcome is important in the given context (La Valle et al, 2019). However, in the case of children’s social care administrative data, the analysis, the findings and the interpretation of findings are not always so straightforward. Constructing a narrative of change or progress through data is vulnerable to bias in which indicators are chosen, how they are analysed and how results are interpreted. It is therefore paramount to acknowledge the complexities of using these data to inform assessments of the quality of local authorities.

The following illustration highlights how preliminary insights from analysis should be reviewed carefully and critically with regard to how they explain an outcome.

Perhaps I am interested in drawing conclusions about the stability of the social worker workforce in the Outer London region. This outcome is important to consider because workforce stability has implications for continuity of care, service effectiveness and economic costs (La Valle et al, 2019; Hussein et al, 2015). The Children’s Commissioner’s Office produces an index of social worker stability; its research has found that labour market indicators such as rates of turnover, vacancies and agency staff are the strongest predictors of a looked after child experiencing multiple social worker changes (Clarke, 2019). In this example, I will explore turnover rate, calculated by dividing the number of leavers by the number of social workers in post at year end (Department for Education, 2019); a worked sketch of that calculation follows. The plot after it shows the turnover rates for each Outer London authority at two time points: 2014 and 2018.
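The sketch below uses made-up counts rather than real DfE figures, purely to show the arithmetic:

```python
# Turnover rate = leavers during the year / social workers in post at year end.
leavers = 24                # illustrative count of social workers who left
in_post_at_year_end = 120   # illustrative count in post at year end

turnover_rate = 100 * leavers / in_post_at_year_end
print(f"Turnover rate: {turnover_rate:.1f}%")  # -> Turnover rate: 20.0%
```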

[Plot: social worker turnover rates for each Outer London local authority, 2014 and 2018]

Local authorities at the top of this plot have had the most dramatic increases in social work staff turnover over the past four years, and those at the bottom have the largest decreases. Shorter lines between the dots represent very small changes in these rates. In drawing conclusions back to the outcome of interest, workforce stability, it is useful to interrogate these findings a bit more, for example by comparing the extremes of these changes. In Bromley, rates of turnover decreased by nearly 20%. In Havering, turnover increased by approximately the same amount. In explaining these changes and their potential impacts, it is useful to consider both authorities’ journeys over this period.

Bromley’s children’s care services were judged Inadequate overall when inspected by Ofsted in June 2016 and made a substantial improvement at their most recent inspection, earning an overall Good rating in January 2019. Similarly, Havering’s children’s social care services improved from a rating of Requires Improvement to be Good in December 2016 to Good as of July 2018. In thinking about the improvement journeys of both local authorities, there may be several explanations for the large discrepancy in turnover rates and what it may indicate about the impact of workforce stability. High staff turnover around the time of an improved rating could reflect an organisational restructure in which ineffective staff left the local authority. Low turnover at a time of an improved rating could mean that stability in the workforce has led to more effective service delivery. Both explanations could be equally plausible.

This explanation may also be supported by examining changes in Ofsted ratings for other local authorities in the region that have experienced large changes in rates of staff turnover in either direction. Waltham Forest, Hounslow, Brent and Hillingdon are four other local authorities that had turnover rates of over 25% in 2014, which have since decreased by around 10%. All four were judged as requiring improvement during inspections in 2014-2015 but received an overall rating of Good at their most recent inspections in 2018-2019. Barnet has also experienced a shake-up of its staff, with turnover increasing by over 15% over this period. As in Havering, this coincided with an improvement in the overall Ofsted rating (moving from Inadequate in 2017 to Good in 2019). These patterns may suggest that both large changes in social work staff and a transition to a more consistent workforce can have beneficial effects on a local authority.

These initial findings can be supplemented by deeper exploration of annual trends in turnover, or by comparisons with other indicators of workforce stability, such as rates of vacancies or staff absences over the year. Turnover alone may not be an accurate measure of stability, and stability itself may not always equate to the most effective services and good practice. It may also be important to consider external factors that influence the movement of social workers, such as neighbouring local authorities offering financial incentives or promises of lighter workloads (ADCS Safeguarding Pressures Phase 6 Main Report, 2018). The social worker workforce does not operate in a vacuum, and the complexities of a constantly changing social care system influence these measures of stability. Changes in policy, budgets and grant funding, and relationships with Ofsted under the new ILACS inspection framework place added pressure on local authorities. Given these limitations, analysis of aggregate administrative data should be acknowledged as an important first step on a research journey, but it should not be the last.

This blog post is written by Emily Buehler, Research Officer at the Rees Centre.

It is part of a series published by the Rees Centre on data.

Related posts:

Using data tools in local authority children’s services 

Data in the right format

Related network: Children’s Social Care Data User Group


This insight piece by Dr Lisa Holmes for the Nuffield Family Justice Observatory explores the nature, availability and use of child-level administrative data at local and regional level by children’s social care departments.

“It is evident that there is value in children’s social care data being analysed locally and regionally, as well as purely being exported by local authorities to meet the statutory reporting requirements, but that the capacity and capability of performance management teams impacts on the ease and frequency that this can take place.”

Use of these data is considered within the context of requirements to submit annual administrative data to the Department for Education. Consideration is also given to issues related to the capacity of local authority performance management teams, which have been drastically reduced in many local authorities as a result of austerity.

Finally, consideration is given to ways in which local authorities might share learning and increase capacity and capability.

Full report: Nuffield Family Justice Observatory, 21/10/2019