

By Jean Mallo, Performance Manager, Children’s Services Performance Team, Wandsworth Borough Council and co-chair of LIEG (London Information Exchange Group).

This post is part of a series published by the Rees Centre on data (see below for related posts).

The five structural and formatting barriers to turning numbers into intelligence, or Data to Insight

Local authority resources are dwindling. There is an increased demand on children’s services to make better and quicker decisions to ensure that the right action is taken promptly to improve the lives of children and families. As a result, there is a renewed pressure on performance teams to turn numbers into intelligence, or Data to Insight.

Each manual step taken to manipulate data into a usable structure and format before analysis can introduce human error. More critically, it leads to duplicated effort across the 151 local authorities in England. To put this into perspective, if a table of published data requires just 5 minutes of manual preparation, that amounts to roughly 12.5 hours across England (151 authorities × 5 minutes), or almost two days of an analyst’s time spent on formatting rather than on understanding or learning.

The five most common types of structural and formatting problems taking up analysts’ time are:

Hidden or blank rows and columns

What you see is not what you get. Copying what appears to be a simple five-by-five table produces a messy muddle that needs rows and/or columns deleted manually and cells unmerged before it can be used in reports or as underlying data for analysis.
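One way this clean-up can be scripted is sketched below in R, assuming a hypothetical workbook published_table.xlsx with blank padding rows and columns and merged header cells; real published tables vary, so this is an illustration rather than a recipe.

```r
library(readxl)   # read Excel workbooks
library(janitor)  # remove_empty()
library(tidyr)    # fill()

raw <- read_excel("published_table.xlsx", col_names = FALSE)

clean <- raw |>
  remove_empty(c("rows", "cols")) |>   # drop fully blank rows and columns
  fill(`...1`, .direction = "down")    # repeat values that merged cells leave blank
```

Hidden rows are exported along with visible ones, so some filtering on a known column is usually still needed afterwards.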

Numbers and dates stored as text or strings

For a number or a date to become insight, it needs to be understood in a wider context. For example: is the number high or low, is there an upward trend or a decline, what is the duration between two dates? Numbers and dates stored as text or strings need to be converted before these questions can be answered.
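A minimal sketch in R of the conversions involved, using invented values; real publications use a variety of number and date formats:

```r
# Hypothetical extract in which numbers and dates arrive as text.
df <- data.frame(
  children   = c("1,234", "987"),
  period_end = c("31/03/2018", "31/03/2019")
)

df$children   <- as.numeric(gsub(",", "", df$children))       # strip thousands separators
df$period_end <- as.Date(df$period_end, format = "%d/%m/%Y")  # parse UK-style dates

diff(df$children)    # upward trend or decline?
diff(df$period_end)  # duration between the two dates
```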

Dealing with missing data

Turning what appears to be ‘no data’ into genuine ‘no data’ can be a time-consuming task: a cell that looks empty may in fact contain a space, a letter or a symbol. The rules change from one source to another, sometimes even within the same organisation or publication.
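One hedged approach in R is to declare every known placeholder up front when reading the file; the file name and markers below are illustrative only, and the right list has to be checked against each source’s notes.

```r
library(readr)

# Placeholders seen in published tables (illustrative, not exhaustive).
missing_markers <- c("", " ", ".", "..", "x", "c", "n/a", "N/A")

df <- read_csv("published_table.csv", na = missing_markers)

# For a column already loaded with mixed placeholders:
clean_na <- function(x, markers = missing_markers) {
  x[trimws(x) %in% trimws(markers)] <- NA
  x
}
```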

Machine readable is not always human readable

On the one hand, the ‘tidy data format’ is an analyst’s dream for converting numbers into tables and visualisations using the latest software and technology. On the other, the standard table format is easy to read and understand not only by local authority analysts but also by the public.
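The two formats are interconvertible, which softens the dilemma: a tidy dataset can be pivoted into a presentation table and back. A sketch in R with invented figures:

```r
library(tidyr)

# Human-readable 'wide' table: one column per year.
wide <- data.frame(
  authority = c("A", "B"),
  `2018` = c(100, 80),
  `2019` = c(110, 75),
  check.names = FALSE
)

# Machine-friendly 'tidy' format: one row per authority-year observation.
long <- pivot_longer(wide, -authority, names_to = "year", values_to = "children")

# And back again for publication.
pivot_wider(long, names_from = "year", values_from = "children")
```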

Inconsistencies between publications

Regardless of the format or structure chosen by an organisation, what carries the most weight is consistency between publications. The children’s social care sector has developed its own data warehouses and analysis tools, and each time structures and formats change, local authorities carry the further burden of rewriting code to manipulate the data back into the required structure.

The first three barriers have been solved in other fields and their solutions are starting to attract the attention of children’s services. There are common standards for how data should be defined, organised and described, and access to information through Application Programming Interfaces (APIs) is proving more consistent than spreadsheets. We know that the Department for Education (DfE) are looking at both solutions, although children’s social care data appears to sit behind education data in the queue for improvements. Whilst we wait for these to be implemented, there could be merit in developing and sharing scripts that normalise data, helping to automate the clean-up of issues such as inconsistent approaches to dates and missing data.
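A shared normalising script of that kind might look something like this sketch in R; the function name and its defaults are illustrative only, not an agreed standard.

```r
# Reads a published CSV, treats known placeholders as missing and
# parses named date columns -- a candidate for a shared tools folder.
normalise_published_table <- function(path,
                                      na_markers = c("", " ", "x", "c"),
                                      date_cols = NULL,
                                      date_format = "%d/%m/%Y") {
  df <- readr::read_csv(path, na = na_markers, show_col_types = FALSE)
  for (col in date_cols) {
    df[[col]] <- as.Date(df[[col]], format = date_format)
  }
  df
}
```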

By contrast, inconsistency between publications feels like a challenge to be met with intelligent and inclusive partnership working rather than technology. Increasingly, organisations that provide data or define data standards, such as the DfE and Ofsted, are starting to recognise that seemingly minor decisions about how to label and organise data can have very significant impacts on local government. We have seen an openness to involving children’s services in those decisions, so that we can show them how best to help us, and to working collaboratively. Formalising and expanding those arrangements could save substantial amounts of time for local government staff, and also enable a new generation of analytical tools to be built on more predictable and comparable data.

Improvements in technology and partnership working continue to drive the change towards centralised data manipulation. The benefit is that children’s services can focus on providing valuable and insightful local analysis that makes a real difference to decisions in their area.

Jean Mallo, Performance Manager, Children’s Services Performance Team, Wandsworth Borough Council and co-chair of LIEG (London Information Exchange Group)

This post is part of a series published by the Rees Centre on data. The Rees Centre welcomes guest blog posts from professionals across the sector. Views expressed are the authors’ own and do not represent those of the Rees Centre.

Related posts:

Using data tools in local authority children’s services

Exploring the complexities of children’s social care data

Related network:

Children’s Social Care Data User Group

The Children’s Social Care Data User Group (CSC DUG) was set up in 2017 by the Rees Centre, University of Oxford and Thomas Coram Research Unit, UCL. It is a network of academics, local authority data managers, analysts, charities and funders with a shared vision that administrative data from children’s social care and other relevant agencies in England can be analysed and fed back into policy and practice to improve the way that children’s social care services respond to children, young people and their families.

The group focuses on data submitted to and analysed by the Department for Education (namely the SSDA 903 children looked after data and the Children in Need Census data).

Membership is open to any individual or organisation who is using, or plans to use, children’s social care data collated by the Department for Education (Child in Need data, Looked After Children data or Section 251 social care expenditure data in relation to children’s services).

To join the group’s mailing list: email rees.centre@education.ox.ac.uk

Analysis of aggregate administrative data

By Emily Buehler, Rees Centre.

This post is part of a series published by the Rees Centre on data (see below for related posts).

Secondary analysis of administrative datasets can provide key insight into children’s social care services’ practices and outcomes. The Department for Education collates data from a range of government sources that is made publicly available through the Local Authority Interactive Tool (LAIT). Over 500 data items at the local authority level are available and span a range of topics related to the safety and well-being of children and young people.

Indicators included in the LAIT tool can be used by researchers, analysts and practitioners to explore relationships and trends that can then inform decision-making, resource allocation, policies and social work practice. As with any research utilising secondary data, it is important to consider how outcomes of interest are defined and measured through the use of specific items. Integral to the process is the identification of an outcome, a decision on which data items can be used to measure that outcome, and justification for why that outcome is important in the given context (La Valle et al, 2019). In the case of children’s social care administrative data, however, the analysis, findings and interpretation of findings are not always so straightforward. Constructing a narrative of change or progress through data is vulnerable to biases in which indicators are chosen, how they are analysed, and how results are interpreted. It is therefore paramount to acknowledge the complexities of using this data to inform assessments about the qualities of local authorities.

The following illustration highlights how preliminary insights from analysis should be reviewed carefully and critically with regard to how they explain an outcome.

Perhaps I am interested in drawing conclusions about the stability of the social worker workforce in the Outer London region. This outcome is important because workforce stability has implications for continuity of care, service effectiveness and economic costs (La Valle et al, 2019; Hussein et al, 2015). The Children’s Commissioner’s Office produces an index of social worker stability; its research has found that labour market indicators such as rates of turnover, vacancies and agency staff are the strongest predictors of a looked after child experiencing multiple social worker changes (Clarke, 2019). In this example, I will explore turnover rate, calculated by dividing the number of leavers by the number of social workers in place at year end (Department for Education, 2019). The following plot highlights the turnover rates for each Outer London authority at two time points: 2014 and 2018.

[Plot: social worker turnover rates for each Outer London local authority in 2014 and 2018, with each authority’s two rates joined by a line.]

Local authorities at the top of this plot have had the most dramatic increases in social work staff turnover over the past four years, and those at the bottom have had the largest decreases. Shorter lines between dots represent very small changes in these rates. In drawing conclusions back to the outcome of interest, workforce stability, it is useful to interrogate these findings a little further, for example by comparing the extremes of these changes. In Bromley, rates of turnover have decreased by nearly 20%; in Havering, turnover has increased by approximately the same amount. In explaining these changes and their potential impacts, it is useful to consider the journeys of both authorities over this period of time.
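As an aside, the rate calculation and the dumbbell-style plot described above can be reproduced with a few lines of R; the figures below are invented for illustration and are not the published workforce data.

```r
library(ggplot2)

# Turnover rate as defined above: leavers over social workers in post at year end.
turnover_rate <- function(leavers, in_post_at_year_end) {
  100 * leavers / in_post_at_year_end
}

df <- data.frame(
  authority = rep(c("Bromley", "Havering", "Barnet"), each = 2),
  year      = factor(rep(c(2014, 2018), times = 3)),
  rate      = c(35, 16, 14, 33, 15, 31)  # illustrative values only
)

# Dumbbell-style plot: one line per authority joining its two years.
ggplot(df, aes(x = rate, y = authority)) +
  geom_line(aes(group = authority)) +
  geom_point(aes(colour = year), size = 3) +
  labs(x = "Turnover rate (%)", y = NULL, colour = "Year")
```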

Bromley’s children’s social care services were judged Inadequate overall when inspected by Ofsted in June 2016 and made a substantial improvement at their most recent inspection, earning an overall Good rating in January 2019. Similarly, Havering’s children’s social care services improved from a rating of ‘Requires improvement to be good’ in December 2016 to Good as of July 2018. In thinking about the improvement journeys of both local authorities, there are several possible explanations for the large discrepancies in turnover rates and what they may indicate about the impact of workforce stability. High staff turnover in 2018, around the time of an improved rating, could be associated with an organisational restructure in which ineffective staff left the local authority. Low turnover at a time of an improved rating could mean that stability in the workforce has led to more effective service delivery. Both explanations could be equally plausible.

These explanations may also be tested by examining the changes in Ofsted ratings for other local authorities in the region that have experienced large changes in rates of staff turnover in either direction. Waltham Forest, Hounslow, Brent and Hillingdon are four other local authorities that had turnover rates of over 25% in 2014 that have since decreased by around 10%. All four were judged as requiring improvement during inspections in 2014-2015 but received an overall rating of Good at their most recent inspections in 2018-2019. Barnet has also experienced a shake-up of its staff, with turnover increasing by over 15% over this period. As in Havering, this coincides with an improvement in its overall Ofsted rating (moving from Inadequate in 2017 to Good in 2019). These patterns may suggest that both large changes in social work staff and a transition to a more consistent workforce can have beneficial effects on a local authority.

These initial findings can be supplemented by deeper exploration of annual trends in turnover, or by comparison with other indicators of workforce stability, such as rates of vacancies or staff absences over the year. Turnover alone may not be an accurate measure of stability, and stability itself may not always equate to the most effective services and good practice. It may also be important to consider external factors that influence the movement of social workers, such as neighbouring local authorities offering financial incentives or promises of lighter workloads (ADCS Safeguarding Pressures Phase 6 Main Report, 2018). The social worker workforce does not operate in a vacuum, and the complexities of a constantly changing social care system influence these measures of stability. Changes in policy, budgets and grant funding, and relationships with Ofsted under the new ILACS inspection framework, place added pressures on local authorities. Given these limitations, analysis of aggregate administrative data should be acknowledged as an important first step on a research journey, but it should not be the last.

This blog post is written by Emily Buehler, Research Officer at the Rees Centre.

It is part of a series published by the Rees Centre on data.

Related posts:

Using data tools in local authority children’s services 

Data in the right format

Related network: Children’s Social Care Data User Group



This blog post is written by Alastair Lee, Children’s Services Data and Information Manager, East Sussex County Council and Chair of the Children’s Services National Performance and Information Management Group. It is part of a series published by the Rees Centre on data.

Data Tools in Local Authority Children’s Services

In every local authority there are people using data to monitor the performance of services. Ideally this supports a conversation between managers and frontline staff to help them get a better understanding of the demand on the service, the pressure on staff and the impact on children, young people and families being supported.

Producing the data in an easily understandable format, whether as tables, charts or dashboards, does not happen at the press of a button. A lot of data wrangling is needed to get the data into a fit state to feed the reports.

Data wrangling

Most of the data we get doesn’t come in a usable format. As a result we need to manipulate it: combining multiple reports, adding a specific data column or extracting specific data to meet our needs. The most basic approach is cutting and pasting, but as this can become very long-winded and tedious, we have developed tools that speed it up. Whether using advanced Excel formulae, VBA or SQL, or the specific data manipulation aspects of Tableau, PowerBI, Knime, Alteryx or R, the basic aim is the same: build something that speeds up the process, reduces the tedium and, as a result, improves accuracy. My team have a shared folder where these tools are kept, and whether used weekly, monthly or annually, they are incredibly valuable.
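As one illustration of such a tool, a short R script that combines a folder of extracts into a single table; the folder name is hypothetical.

```r
library(readr)
library(dplyr)
library(purrr)

files <- list.files("monthly_extracts", pattern = "\\.csv$", full.names = TRUE)

combined <- files |>
  set_names() |>                           # name each element by its file path
  map(read_csv, show_col_types = FALSE) |> # read every report
  bind_rows(.id = "source_file") |>        # record which report each row came from
  mutate(extracted_on = Sys.Date())        # add a specific data column
```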

Presenting data

Once we have the data in the right format, we plug it into a data visualisation tool. Currently the programme most often used for this is Excel, simply because all staff have it on their PC or laptop, so it is easy to deploy widely. Also, if an Excel dashboard is developed for a specific purpose (e.g. the ChAT[1] or the Children’s Social Care Benchmarking tool, both developed by the Data to Intelligence project), I can download it, add my own data and get an output without needing to install any new software. But Excel has limitations when working with large, complex datasets and in how the data can be visualised. As a result, work is being developed using more bespoke programs (e.g. Tableau, QlikView and PowerBI), and once a local authority has these installed and a way of deploying to colleagues, the same ability to share templates and visualisations applies.

Analysing data

When analysing data, Excel is still the most common tool used; it’s what we’ve got! Some local authorities use SPSS, and R is beginning to appear, driven by expertise arriving with new staff who have used these programmes elsewhere. The analysis itself is being driven by greater interest from service managers in the longer-term impact of interventions and services, the impact of changes to services, and the need to forecast more accurately.

A place to share

The situation at the moment is that many local authorities have developed their own data wrangling, visualisation and analysis tools, while some have not because they don’t have people with the skills to do so. This leads to duplication of effort between some local authorities and to others not accessing tools that would help improve outcomes for children and young people. This is a waste!

To address this, the Children’s Services National Performance and Information Management Group (CS-NPIMG), the South East Sector Led Improvement Programme (SESLIP), the Data to Intelligence project, Ofsted and Social Finance’s Collaborative Technology Initiative are developing a curated data tools library where these tools can be hosted, shared and co-developed. The project is in its very early days, but we have good learning from the open source movement, from the development and sharing of the ChAT, which is now used by 150 LAs, and from the SE Data Tools library, which has looked at the impact that sharing a tool can have on the local authority that shares it.

One unexpected consequence of our current experience of sharing tools is that it improves data quality in the statutory returns.

This happened because sharing enabled us to see what the data would look like once processed by the Department for Education; previously we would only know this once a submission had been made and an error report returned. As the tools are open source, we can all see how the data we enter is transformed to create the output, and this can reveal where errors may have arisen in the past. It has also contributed to discussions about the development of standard data sets that we can all use for analysis, visualisation and research. All this is from a very limited number of shared tools; there is more work to be done to increase the number of tools for visualisation, data wrangling and analysis that will help improve outcomes for children and families in need of support.

[1] The ChAT is the Children’s Services Analysis Tool, developed by a group of London LAs and Ofsted to better visualise the data that is shared between a local authority Children’s Services department and Ofsted during an inspection.

This blog post is written by Alastair Lee, Children’s Services Data and Information Manager, East Sussex County Council and Chair of the Children’s Services National Performance and Information Management Group.
Contact Alastair: Alastair.Lee@eastsussex.gov.uk

It is part of a series published by the Rees Centre on data.

Related network:

Children’s Social Care Data User Group


How do we know if children’s social care services make a difference?

The development of an Outcomes Framework based on the views of those who plan, deliver and use these services, as well as the existing evidence base, aims to go some way to answering this question.

More about this project

The final report published by the Rees Centre was launched at the Nuffield Foundation in London on 11 July.