Care leavers in England are over ten times more likely than their peers to be not in education, employment or training (NEET) in their 21st year, major new analysis shows.
Overall, nearly one-third were NEET compared to just 2.4 per cent in the general population and 13 per cent of 21-year-olds. The vast majority of these were defined as ‘economically inactive’ due to disability – including mental health issues – or caring responsibilities. Among those care leavers who were working, over two-thirds were in precarious roles that were short-term, part-time or poorly paid.
The study was funded by the Nuffield Foundation and based at the Rees Centre at the University of Oxford. It was led by Dr Neil Harrison (now at the University of Exeter) and Jo Dixon (University of York).
Neil Harrison said: “This is the first study of its kind to explore over time what happens to care leavers and other care-experienced young people in early adulthood. We have been able to document the acute challenges they face in making positive transitions towards stability and wellbeing.”
“What we clearly see in the data is that the legacy of earlier disadvantages, such as childhood trauma or disruptions to schooling, gets cemented in early adulthood. While around a quarter of care leavers were able to access higher education or stable work by their 21st year, the majority were reliant on benefits or precarious employment. Urgent action is needed to remedy this.”
Researchers used data including the newly available Longitudinal Education Outcomes (LEO) dataset for young people born between 1st September 1995 and 31st August 1996. A total of 3,850 of the 530,440 individuals in the dataset were care leavers, and 28,810 had some experience of the children’s social care system. They also interviewed 28 care leavers and 41 professionals across five local authorities, including personal advisers, leaving care team members, virtual school staff and carers.
The research shows a strong link between economic inactivity and higher levels of special educational needs during Key Stage 4, including attending a special school. This was particularly marked for care leavers, of whom 62.4 per cent were identified as having a high level of need.
Neil Harrison said: “Good GCSE grades – especially in English and mathematics – had a very strong role in determining which onward pathways were available. However, many care leavers were not able to attain as highly as they might have due to what was going on in their lives. This reinforces the vital importance of ‘second chance’ pathways, especially through further education colleges.”
Those interviewed said the support of extended family and other social networks was essential to them finding jobs and transitioning to adult life. Care leavers and professionals reported practical barriers in accessing youth employment schemes like Kickstart. They supported care leavers being given preferential access to employment opportunities by councils as part of their ‘corporate parenting’ responsibilities.
Jo Dixon said: “More can be done to remove barriers and disincentives to work for care-experienced young people. This includes addressing the impact of lower minimum wage rates for under-23s in employment and apprenticeships, many of whom are without parental support and so carry financial responsibility for rent and living costs. This is a particular priority for young people in expensive supported accommodation, which can make taking up work-related opportunities unviable.”
“There is already scope to implement ring-fenced and supported work-related opportunities specifically for care-experienced young people. Guaranteed interviews, targeted and supported work-experience schemes and dedicated employment opportunities should be on offer. Utilising corporate parenting and corporate social responsibility in this way will benefit care-experienced young people and the local labour market.”
Rob Street, Director of Justice at the Nuffield Foundation, said: “This important study highlights the range of challenges that young care leavers face in accessing the education, employment and training opportunities that underpin the transition into adulthood. The report makes a number of well-evidenced, practical recommendations to national and local policymakers and others for measures to assist this often multiply-disadvantaged group of children and young people.”
Recommendations from the study include:
- Providing strong routes for young people to go into (and back into) post-16 education and training
- National government should provide additional ‘top up’ funding for care leavers to participate in apprenticeships and other schemes to ensure that they are not financially disadvantaged
- Young people leaving care between 14 and 16 should be considered as an ‘at risk’ group with respect to complex transitions into adulthood
- Stronger links with local employers to improve young people’s knowledge of the range of opportunities available to them
- Targeted pre-employment and pre-apprenticeship support to prepare young people with the most complex needs to take steps towards work-related opportunities
- Education providers and employers should have greater awareness of trauma and mental health needs for care leavers and other care-experienced young people
The views and experiences of over 7,500 children and young people in care on their contact with family members, and its impact on their wellbeing, are uncovered in a new report published today by the charity Coram Voice and the Rees Centre at the University of Oxford.
Staying Connected finds that nearly a third (31%) of children (aged 8-10) and a quarter (25%) of young people (aged 11-18) felt they were seeing their mothers too little, whilst over a fifth (22%) of children and 18% of young people felt they were seeing their fathers too little. 22% of children didn’t feel they had enough contact with their brothers and sisters, and this figure was higher for young people (31%). About one in five young people had no contact with either parent and this was particularly the case for those in residential care and boys.
Visits being arranged at inconvenient times, long distances, the costs of travel, their family’s circumstances, and workers failing to make necessary arrangements were among reasons cited by children and young people for seeing family less often than they wanted. Children in care who felt they saw family members too little reported feeling sad, angry and unsettled, while in contrast, those who felt contact arrangements were “just right” felt they were being listened to and looked forward to seeing their family.
One young person (aged 11-18) commented: “I want to see my family more. My social worker is supposed to be doing police checks. I have been here since September and the checks have not been done. It’s not like I can just visit. I live five hours from home.”
Whether children and young people felt that they saw parents often enough was statistically associated with length of time in care, type of placement and which local authority was caring for them. Analysis shows that young people (aged 11-18) in residential care more frequently reported that they had too little contact with family compared to young people in other types of placements. The number of placements experienced also had an impact, with 60% of young people who had only had one placement reporting they were satisfied with their contact frequency, compared to 39% who had experienced 11 or more placements.
In addition, 50% of young people surveyed did not feel involved in decisions social workers made about their lives, and half of the comments about involvement focused on contact arrangements. Children and young people commented on arrangements being inflexible, not changing as they got older or as their family’s circumstances changed. One child (aged 8-10) commented: “I used to see Mum and older brother three times a week. It has been cut down to once a week and this makes me sad. I don’t know why contact was cut down.”
Comments also highlighted that children and young people wanted to see extended family members, pets and other adults who were important to them, and that the key people in their lives were not always included in contact plans.
Staying Connected is the latest report to be published as part of the Bright Spots programme and it makes seven key recommendations to improve policy and practice:
- Work with all children in care to identify the key relationships in their lives
- Make arrangements for children and young people to maintain contact, develop relationships and reconnect with people who are important to them
- Listen to and involve children and young people in decisions about the arrangements to see and keep in touch with family and others who are important to them
- Keep children in care informed about their families, why they can or cannot see them, and what arrangements have been made for them to spend time together
- Ensure plans are regularly reviewed and reflect the current circumstances, wishes and needs of children and young people and their families
- Normalise family time whenever possible, minimising the use of contact centres and supporting children and families to meet in the community
- Make sure the workforce has the skills and knowledge to prioritise and confidently support children in care to stay connected to the people who are important to them
Linda Briheim-Crookall, Head of Policy and Practice Development at Coram Voice, said: “The recent Care Review suggested the primary objective of the care system should be promoting the formation of lifelong loving relationships around children in care and care leavers. This can only be achieved if more is done to build rather than break relationships with the people who are already important to children in care. Our research showed that there is still some way to go to make this happen. Services and workers must listen to children and young people about who they want to see, when and how and seek to make this happen. Children in care should have the opportunity to spend time with the people who are important to them doing everyday things like playing games, having a meal or going for a walk with the dog.”
Julie Selwyn, Professor of Education and Adoption at the Rees Centre, University of Oxford, said: “While previous UK research has emphasised that the quality of contact is more important than the frequency, from young people’s perspective frequency was equally, if not more, important. Feeling contact was ‘just right’ was associated with higher levels of wellbeing. Staying connected to the important people in life is essential for children’s wellbeing. Greater efforts need to be made to ensure that this is achieved for all children in care.”
To read the full report, watch a video on the findings and download resources for agencies and local authorities, please visit coramvoice.org.uk/staying-connected-report.
Findings Report and Draft Guidelines Published
The full peer-reviewed research report and draft guidelines, grounded in systematic research with eight local authority areas and corresponding health trusts in England and Wales, are published today.
The research identified consensus among frontline practitioners and parents about what constitutes best practice when local authorities issue care proceedings at birth – but it also uncovered numerous challenges, ranging from discontinuities, delays and resource constraints to risk-averse practice, shortfalls in family-inclusive practice, insufficient professional specialism and poor inter-agency collaboration. The report emphasises the need for a more consistent, sensitive approach to practice, underpinned by an understanding of trauma, and recommends more training, supervision and support for professionals working in this emotionally challenging area of practice.
The draft guidelines, grounded in the research, include a series of aspirational statements for each stage of the parents’ journey and provide examples of how these statements can be translated into best practice. They consider how to overcome challenges at both a strategic level and in frontline practice. They also include examples of innovations from practice drawn from across England and Wales.
Between now and August 2022, the participating local authorities and NHS trusts are working with the team to test the feasibility of the guidelines. The intention is for the guidelines to be used as a basis for developing local area action plans and locality specific guidelines, within the context of national guidance. Findings from this feasibility study will inform a final version of the guidelines, which will be published later in 2022.
Accompanying reviews led by the Oxford Team also published 24th February
Two additional reviews undertaken as part of the research and led by the Rees Centre are also published today.
The first, a review of guidance in the eight participating local authorities, covers professional practice concerning parent/infant separation within the first few days of life. The second, an evidence review of families’ experiences of perinatal loss, identifies key messages that may be applicable to practice surrounding separation at birth. Both reports provide important background and context when considering how to improve practices surrounding separation at birth.
Read more about Born into Care on the project page and on the University of Oxford website.
Ecorys, the Rees Centre at the University of Oxford, and Ipsos MORI have been appointed by the Department for Education to explore the potential for a seminal study to independently research the needs, experiences and outcomes of children and young people leaving care on Adoption Orders (AOs) and Special Guardianship Orders (SGOs).
There is currently limited research around how these two routes to permanence affect children’s long-term outcomes as they progress into adolescence and adulthood. We hope to follow the lives of young people aged 12-21 growing up in adoption and special guardianship families.
The purpose is to help:
- Assess the long-term outcomes for children growing up in adoption and special guardianship families;
- Support improved outcomes for children by enhancing our understanding of what influences the support needs and outcomes for adoptive families and special guardianship families;
- Understand the role of key stakeholders in supporting outcomes for previously looked after children, and the impact this has on their outcomes; and
- Support improved decision making by LAs and courts on permanency options for children who cannot return home to live with their birth parents.
Over the next six months, we will conduct a feasibility study to explore how best to approach families and encourage involvement in a longitudinal study. We will consult with stakeholders from the adoption and special guardianship sectors, and with families, to help us design the research and make plans to pilot the next stage.
The final reporting is scheduled for 2028.
More information on the project can be found on the study’s project page.
I am delighted to chair the evidence group for Josh MacAlister’s review of the care system, described by the Secretary of State who launched it as a “wide-ranging, independent review to address poor outcomes for children in care as well as strengthening families to improve vulnerable children’s lives.”
Josh MacAlister and the review team published their opening position on Thursday 17th June, a statement on the case for change.
The review has heavy billing, not least as the level of government borrowing is higher now than when a previous Conservative chancellor demanded austerity. The current administration may be more inclined to spend but will they spend on children outside of universal services?
The review is also in the shadow of the recent review of the system in Scotland, which was much longer, was evidently led by those with experience of care, and reported into a devolved administration that has a clear articulation of its commitment to deliver the rights of children.
The pressure is certainly on. The case for change sets out Josh’s interpretation, and that of the review team, of what they have heard so far in listening to and reading personal testimony, academic research, expert views and responses submitted to the review. From this, the case for change indicates where the review team think the system needs to change.
It is very welcome that the review is publishing the case for change so that everyone with an interest knows where the review will focus and is able to respond on the more specific issues. I know that Josh and the team are very open to all responses on these questions and I know that they are listening. People can respond here.
There are two old clichés about how those outside positions of power in government might best engage in the business of government. One cliché is the lift test: “what do you do if you find yourself in a lift with a senior figure?” How to cut through, what to say, how to be heard? The second cliché concerns a train leaving the station. You don’t get to decide when the train runs; your choice is whether to ride the train.
The Evidence Group is one of three groups providing support to the review. Of particular importance is the Experts by Experience Board, there to ensure a voice in the review for those who have had a social worker (either themselves or a child in their care). The Design Group will help guide how the review designs its recommendations.
In my work I have tried to bring good evidence to bear on policy and practice and to help ensure it is used meaningfully and accurately. To do this we need a clear idea about what we mean by good evidence: what counts as evidence, for whom, about what, applied how? We might call this an episteme, a framework of agreement about what counts as knowledge and how it should be interpreted, which allows us to settle on truths, or at least determine what we mean by truthfulness, in how we answer research questions intended to inform decisions.
As chair of the evidence group, I can say that the review team have had access to a great deal of high quality evidence of multiple sorts on multiple questions. In the time available Josh and the team have made their interpretation of what it says about what should change in the care system, focusing particularly on the side of problems and issues requiring attention, rather than the many daily successes and positive outcomes that make up so much of existing practice and experience.
The members of the evidence group appointed by the review have submitted their views on the review team’s reading of the evidence and on the team’s interpretation and representations of it. Ultimately the case for change is not primarily an evidentiary paper in the sense of being set up as a research or science project with a clear technical methodology to address a narrow scientific or social scientific question. It isn’t subject to formal peer review and approval in the way that a National Statistic or an academic paper might be. Neither I nor the other group members get to sign off the document. It is ultimately the view of Josh and his team and that is in part what is meant by an independent review. Another reviewer might have looked at the evidence differently and made a different case or called for different changes.
I hope it leads to a fruitful discussion. For what it is worth I think the field suffers from a lack of agreement about what counts as good evidence. Because of the nature of the evidence as yet available and the diversity of views on it, many of the issues in the case for change are subject to considerable uncertainty and disagreement so it is likely that debate will continue.
I don’t think the question of the appropriate balance between statutory care of children and wider support to families is resolved by the evidence available, nor do we know enough in aggregate about what structures best help people provide the right supports to which groups of children at the right times. I agree it is good to have a debate about these things. The available evidence can inform and there will be more evidence gathering in the next stages of the review.
I hope that the review goes on to make valuable and effective recommendations that address many of the issues and challenges raised in evidence to the review and that these lead to real improvements in the experiences of children, families and care-experienced people. I hope that the review is able to address the clear call from those with care experience to be heard, not just in the review, but in perpetuity. Finally, I hope the review addresses the need to improve knowledge and understanding, both in terms of how the care system might be improved and in helping the public, and hence government, recognise the work of and hear the voices of care-experienced people, children, social workers, carers, directors of children’s services and others who are too often drowned out of the public debate.
We will all have differing views on all of this. I hope we will have more blogs in the weeks ahead.
Read more about the case here.
By Jean Mallo, Performance Manager, Children’s Services Performance Team, Wandsworth Borough Council and co-chair of LIEG (London Information Exchange Group).
This post is part of a series published by the Rees Centre on data (see below for related posts).
The five structural and formatting barriers to turning numbers into intelligence, or Data to Insight.
Local authority resources are dwindling. There is an increased demand on children’s services to make better and quicker decisions to ensure that the right action is taken promptly to improve the lives of children and families. As a result, there is a renewed pressure on performance teams to turn numbers into intelligence, or Data to Insight.
Each manual step taken to manipulate data into a usable structure and format before analysis can result in human error. More critically, it leads to duplicated effort across 151 local authorities in England. To put this into perspective, if a table of published data requires just 5 minutes of manual preparation, that is 5 minutes repeated 151 times – around 12.5 hours, or almost two days of analyst time across England – spent on formatting rather than on understanding or learning.
The five most common types of structural and formatting problems taking up analysts’ time:
Hidden or blank rows and columns
What you see is not what you get. Copying what appears to be a simple five-by-five table results in a messy muddle that requires manual deletion of rows and/or columns and unmerging cells before it can be used in reports or used as underlying data for analysis.
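As a rough illustration of how this clean-up can be scripted rather than done by hand, here is a minimal sketch in R. The file name, sheet and number of rows to skip are placeholders, and it assumes the readxl and janitor packages; it is one way an analyst might automate the step, not a prescribed approach.

```r
# Minimal sketch, assuming a hypothetical published workbook "stats_table.xlsx"
# with title rows above the real table and merged header cells.
library(readxl)   # reads Excel workbooks
library(janitor)  # remove_empty() and clean_names()

raw <- read_excel("stats_table.xlsx", sheet = 1, skip = 3)  # skip presentation rows (placeholder value)

tidy_table <- raw |>
  remove_empty(which = c("rows", "cols")) |>  # drop blank/hidden rows and columns
  clean_names()                               # consistent column names left by merged headers

head(tidy_table)
```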
Numbers and dates stored as text or strings
For a number or a date to become insight it needs to be understood in the wider context. For example – is the number high or low, is there an upward trend or a decline, what is the duration between two dates? Numbers and dates stored as text or strings need to be converted to genuine numeric and date types before these questions can be answered.
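A hedged sketch of that conversion in R: the column names, thousands separators and day/month/year format below are illustrative assumptions, not taken from any particular publication.

```r
# Illustrative only: column names, separators and the date format are assumptions.
df <- data.frame(
  children_in_need = c("1,234", "987", "1,005"),                   # numbers stored as text
  period_end       = c("31/03/2019", "31/03/2020", "31/03/2021"),  # dates stored as text
  stringsAsFactors = FALSE
)

df$children_in_need <- as.numeric(gsub(",", "", df$children_in_need))  # strip separators, convert to numeric
df$period_end       <- as.Date(df$period_end, format = "%d/%m/%Y")     # parse the text into real dates

# Once converted, trend and duration questions become answerable:
diff(df$children_in_need)                                         # year-on-year change
difftime(max(df$period_end), min(df$period_end), units = "days")  # duration between two dates
```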
Dealing with missing data
Turning what appears to be ‘no data’ into genuine ‘no data’ can be a time-consuming task – a space may take the place of what appears to be an empty cell, or it may be filled with a letter or a symbol. The rules change from one source to another, sometimes even within the same organisation or publication.
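One way to script that clean-up, again as a hedged R sketch: the placeholder symbols listed below are examples of the kind of suppression codes found in published tables, not a definitive list.

```r
# Illustrative only: "", " ", "x", "c" and "*" stand in for whatever codes a given source uses.
raw_values <- c("120", " ", "x", "c", "95", "*", "")

clean_values <- as.numeric(
  ifelse(trimws(raw_values) %in% c("", "x", "c", "*"), NA, raw_values)
)
clean_values
#> [1] 120  NA  NA  NA  95  NA  NA

# The same can be done at import time, e.g. readr::read_csv(file, na = c("", "x", "c", "*")),
# but the list of codes still has to be checked source by source.
```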
Machine readable is not always human readable
On the one hand, the ‘tidy data format’ is an analyst’s dream for converting numbers into tables and visualisations using the latest software and technology. On the other, the standard table format is easy to read and understand not only by local authority analysts but also by the public.
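The tension is easier to see with a small example. The sketch below, using the tidyr package and invented figures, moves the same information between the tidy shape analysts want and the wide, reader-friendly table; the authority names, years and rates are placeholders.

```r
# Illustrative only: authorities, years and rates are invented placeholders.
library(tidyr)

tidy <- data.frame(
  local_authority = c("Authority A", "Authority A", "Authority B", "Authority B"),
  year            = c(2019, 2020, 2019, 2020),
  rate            = c(41.2, 39.8, 55.0, 57.3)
)

# Machine-readable rows -> the wide table a human reader expects
wide <- pivot_wider(tidy, names_from = year, values_from = rate)

# ...and back to one observation per row when the analysis tools need it
long <- pivot_longer(wide, cols = -local_authority, names_to = "year", values_to = "rate")
```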
Inconsistencies between publications
Regardless of the format or structure chosen by an organisation, the one thing that carries the most weight is consistency between publications. The children’s social care sector has developed its own data warehouses and analysis tools. Each time structures and formats change, local authorities face the further burden of rewriting the code that manipulates the data back into the required structure.
The first three barriers have been solved in other fields and their solutions are starting to attract the attention of children’s services. There are common standards on how data should be defined, organised and described, and accessing information through Application Programming Interfaces (APIs) is proving more consistent than spreadsheets. We know that the Department for Education (DfE) are looking at both solutions, although children’s social care data appears to be behind education data in the queue for improvements. Whilst we wait for these to be implemented, there could be merit in developing and sharing scripts that normalise data, to help automate the clean-up of issues such as inconsistent approaches to dates and missing data.
By contrast, inconsistencies between publications feel like a challenge to be met with intelligent and inclusive partnership working rather than technology. Increasingly, we see organisations that provide data or define data standards, such as the DfE and Ofsted, starting to recognise that seemingly minor decisions about how to label and organise data can have very significant impacts for local government. We have seen an openness to involve children’s services in those decisions, to show them how best to help us, and to work collaboratively. Formalising and expanding those arrangements could save substantial amounts of time for local government staff, and enable a new generation of analytical tools to be built on more predictable and comparable data.
Improvements in technology and partnership working continue to drive the change towards centralised data manipulation. The benefit of this is that children’s services can focus on providing valuable and insightful local analysis that makes a real difference to decisions in their area.
Jean Mallo, Performance Manager Children’s Services Performance Team Wandsworth Borough Council and co-chair of LIEG (London Information Exchange Group)
This post is part of a series published by the Rees Centre on data. The Rees Centre welcomes guest blog posts from professionals across the sector. Views expressed are the authors’ own and do not represent those of the Rees Centre.
Related posts:
Using data tools in local authority children’s services
Exploring the complexities of children’s social care data
Related network:
Children’s Social Care Data User Group
The Children’s Social Care Data User Group (CSC DUG) was set up in 2017 by the Rees Centre, University of Oxford and Thomas Coram Research Unit, UCL. It is a network of academics, local authority data managers, analysts, charities and funders with a shared vision that administrative data from children’s social care and other relevant agencies in England can be analysed and fed back into policy and practice to improve the way that children’s social care services respond to children, young people and their families.
The group focuses on data submitted to and analysed by the Department for Education (namely the SSDA 903 children looked after data and the Children in Need Census data).
Membership is open to any individual or organisation who is using, or plans to use, children’s social care data collated by the Department for Education (Child in Need data, Looked After Children data or Section 251 social care expenditure data in relation to children’s services).
To join the group’s mailing list: email rees.centre@education.ox.ac.uk
Analysis of aggregate administrative data
By Emily Buehler, Rees Centre.
This post is part of a series published by the Rees Centre on data (see below for related posts).
Secondary analysis of administrative datasets can provide key insight into children’s social care services’ practices and outcomes. The Department for Education collates data from a range of government sources that is made publicly available through the Local Authority Interactive Tool (LAIT). Over 500 data items at the local authority level are available and span a range of topics related to the safety and well-being of children and young people.
Indicators included in the LAIT tool can be used by researchers, analysts and practitioners to explore relationships and trends that can then inform decision-making, resource allocation, policies and social work practices. As with any research utilising secondary data, it is important to consider how outcomes of interest are defined and measured through the use of specific items. Integral to the process is the identification of an outcome, a decision on which data items can be used to measure that outcome, and justification for why such an outcome is important in the given context (La Valle et al, 2019). However, in the case of children’s social care administrative data, the analysis, findings and interpretation of findings are not always so straightforward. Constructing a narrative of change or progress through data can be vulnerable to biases in which indicators are chosen, how they are analysed, and how results are interpreted. It is therefore paramount to acknowledge the complexities of using this data to inform assessments about the qualities of local authorities.
The following illustration highlights how preliminary insights from analysis should be reviewed carefully and critically with regard to how they explain an outcome.
Suppose I am interested in drawing conclusions about the stability of the social worker workforce in the Outer London region. This outcome is important to consider because workforce stability has implications for continuity of care, service effectiveness and economic costs (La Valle et al, 2019; Hussein et al, 2015). The Children’s Commissioner’s Office produces an index of social worker stability; its research has found that labour market indicators such as rates of turnover, vacancies and agency staff are the strongest predictors of a looked after child experiencing multiple social worker changes (Clarke, 2019). In this example, I will explore the turnover rate, which is calculated by dividing the number of leavers by the number of social workers in place at year end (Department for Education, 2019). The following plot highlights the turnover rates for each Outer London authority at two time points: 2014 and 2018.
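As a rough sketch of both the calculation and the kind of two-point (‘dumbbell’) plot described here, the R code below uses invented figures for three unnamed authorities; it is not the actual Outer London data or the analysis behind the plot.

```r
# Illustrative only: the leaver and headcount figures are invented placeholders.
library(ggplot2)

workforce <- data.frame(
  la      = rep(c("Authority A", "Authority B", "Authority C"), each = 2),
  year    = rep(c(2014, 2018), times = 3),
  leavers = c(30, 14, 22, 35, 18, 19),
  in_post = c(100, 105, 110, 108, 95, 97)
)

# Turnover rate: leavers divided by social workers in place at year end
workforce$turnover <- workforce$leavers / workforce$in_post

# One line per authority joining its 2014 and 2018 rates, with a point for each year
ggplot(workforce, aes(x = turnover, y = la)) +
  geom_line(aes(group = la)) +
  geom_point(aes(colour = factor(year)), size = 3) +
  labs(x = "Social worker turnover rate", y = NULL, colour = "Year")
```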
Local authorities at the top of this plot have had the most dramatic increases in social work staff turnover over the past four years, and those at the bottom have the largest decreases. Shorter lines between the dots represent smaller changes in these rates. In drawing conclusions back to the outcome of interest, workforce stability, it is worth interrogating these findings further, starting with the extremes of these changes. In Bromley, rates of turnover have decreased by nearly 20%; in Havering, turnover has increased by approximately the same amount. In explaining these changes and their potential impacts, it is useful to consider the journeys of both authorities over this period.
Bromley’s children’s social care services were judged Inadequate overall when inspected by Ofsted in June 2016 and made a substantial improvement at their most recent inspection, earning an overall Good rating in January 2019. Similarly, Havering’s children’s social care services improved from a rating of ‘Requires Improvement to be Good’ in December 2016 to ‘Good’ as of July 2018. In thinking about the improvement journeys of both local authorities, there may be several possible explanations for the large discrepancies in turnover rates and what they may indicate about the impact of workforce stability. High staff turnover in 2018, around the time of an improved rating, could be associated with an organisational restructure in which ineffective staff left the local authority. Low turnover at a time of improved rating could mean that stability in the workforce has led to more effective service delivery. Both explanations could be equally plausible.
These explanations may also be explored by examining the changes in Ofsted ratings for other local authorities in the region that have experienced large changes in rates of staff turnover in either direction. Waltham Forest, Hounslow, Brent and Hillingdon are four other local authorities that had turnover rates of over 25% in 2014 which have since decreased by around 10%. All four were judged as requiring improvement during inspections in 2014-2015 but received an overall rating of Good at their most recent inspections in 2018-2019. Barnet has also experienced a shake-up of its staff, with turnover increasing by over 15% over this period. As in Havering, this coincides with an improvement in its overall Ofsted rating (moving from Inadequate in 2017 to Good in 2019). These patterns may suggest that both large changes in social work staff and a transition to a more consistent workforce can have beneficial effects on the local authority.
These initial findings can be supplemented by deeper explorations into annual trends of turnover or by making comparisons to other indicators relating to the stability of the workforce, such as the rates of vacancies or staff absences over the year. Turnover alone may not be an accurate measure of stability, and stability itself may not be a concept that always equates to the most effective services and good practices. It may be important to consider external factors that may influence the movement of social workers, such as neighbouring local authorities offering financial incentives or promises of lighter workloads (ADCS Safeguarding Pressures Phase 6 Main Report, 2018). The social worker workforce does not operate in a vacuum and the complexities of a constantly-changing social care system influence these measures of stability. Changes in policy, budgets and grant funding, and relationships with Ofsted under the new ILACS inspection framework place added pressures on local authorities. Given these limitations, analysis of aggregate administrative data should be acknowledged as an important first step on a research journey, but it should not be the last.
This blog post is written by Emily Buehler, Research Officer at the Rees Centre.
It is part of a series published by the Rees Centre on data.
Related posts:
Using data tools in local authority children’s services
Related network: Children’s Social Care Data User Group
Alastair Lee
This blog post is written by Alastair Lee, Children’s Services Data and Information Manager, East Sussex County Council and Chair of the Children’s Services National Performance and Information Management Group. It is part of a series published by the Rees Centre on data.
Data Tools in Local Authority Children’s Services
In every local authority there are people using data to monitor the performance of services. Ideally this supports a conversation between managers and frontline staff to help them get a better understanding of the demand on the service, the pressure on staff and the impact on children, young people and families being supported.
Producing the data in an easily understandable format, whether as tables, charts or dashboards, does not happen at the press of a button. A lot of data wrangling is needed to get the data into a fit state to feed the reports.
Data wrangling
Most of the data we get doesn’t come in a usable format. As a result we need to manipulate it to combine multiple reports, add a specific data column or extract specific data to meet our needs. The most basic approach is cutting and pasting, but as this can become very long-winded and tedious we have developed tools that speed it up. Whether using advanced Excel formulae, VBA or SQL, or the data-manipulation features of Tableau, PowerBI, Knime, Alteryx or R, the basic aim is the same: build something that speeds up the process, reduces the tedium and, as a result, improves accuracy. My team has a shared folder where these tools are kept, and whether used weekly, monthly or annually they are incredibly valuable.
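As one hedged example of this kind of small tool, the R sketch below stacks a folder of extract files into a single table and records which file each row came from. The folder name and file pattern are assumptions, and it presumes the extracts share the same columns; it illustrates the idea rather than any particular team’s script.

```r
# Sketch only: "monthly_extracts/" and the CSV naming are placeholders.
library(readr)
library(dplyr)
library(purrr)

files <- list.files("monthly_extracts", pattern = "\\.csv$", full.names = TRUE)

combined <- files |>
  set_names() |>                            # keep each file path as a name
  map(read_csv, show_col_types = FALSE) |>  # read every extract in the folder
  bind_rows(.id = "source_file")            # stack them, recording where each row came from

write_csv(combined, "combined_extract.csv")
```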
Presenting data
Once we have the data we need in the right format, we plug it into a data visualisation tool. Currently the programme most often used for this is Excel, simply because all staff have it on their PC or laptop and so it is easy to deploy widely. Also, if an Excel dashboard has been developed for a specific purpose (e.g. the ChAT[1] or the Children’s Social Care Benchmarking tool, both developed by the Data to Intelligence project), I can download it, add my own data and get an output without needing to install any new software. But Excel has limitations when working with large, complex datasets and in how the data can be visualised. As a result, work is being done using more bespoke programs (e.g. Tableau, QlikView and PowerBI), and once a local authority has these installed, along with a way of deploying them to colleagues, the same ability to share templates and visualisations applies.
Analysing data
When analysing data, Excel is still the most common tool – it’s what we’ve got! Some local authorities use SPSS, and R is beginning to appear, driven by expertise arriving with new staff who have used these programmes elsewhere. The analysis itself is driven by greater interest from service managers in the longer-term impact of interventions and services, the impact of changes to services and the need to forecast more accurately.
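A minimal sketch of the simplest version of that forecasting, fitting a straight-line trend to invented monthly referral counts in base R; real demand forecasting would use real data and more careful methods (seasonality, uncertainty intervals and so on).

```r
# Illustrative only: the monthly referral counts are invented.
referrals <- data.frame(
  month = 1:24,
  count = c(310, 305, 322, 330, 340, 335, 328, 345, 350, 362, 358, 370,
            375, 368, 380, 392, 388, 400, 405, 398, 412, 420, 415, 430)
)

trend <- lm(count ~ month, data = referrals)         # simple linear trend
predict(trend, newdata = data.frame(month = 25:30))  # naive forecast for the next six months
```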
A place to share
The situation at the moment is that many local authorities have developed their own data wrangling, visualisation and analysis tools, while some cannot because they don’t have people with the skills to do so. This leads to duplication of effort between some local authorities and others not accessing tools that would help improve outcomes for children and young people. This is a waste!
To address this, the Children’s Services National Performance and Information Management Group (CS-NPIMG), the South East Sector Led Improvement Programme (SESLIP), the Data to Intelligence project, Ofsted and Social Finance’s Collaborative Technology Initiative are developing a curated data tools library where these tools can be hosted, shared and co-developed. The project is in its very early days, but we have good learning from the open source movement, from the development and sharing of the ChAT, which is now used by 150 LAs, and from the SE Data Tools library, which has looked at the impact that sharing a tool can have on the local authority that shares it.
One unexpected consequence of our current experience of sharing tools is that it improves data quality in the statutory returns.
This happened because sharing enabled us to see what the data would look like once processed by the Department for Education; previously we would only know this once a submission had been made and an error report returned. As the tools are open source, we can all see how the data we enter is transformed to create the output, and this can reveal where errors may have arisen in the past. It has also contributed to discussions about the development of standard data sets that we can all use for analysis, visualisation and research. This is all from a very limited number of shared tools; there is more work to be done to increase the number of tools for visualisation, data wrangling and analysis that will help improve outcomes for children and families in need of support.
[1] The ChAT is the Children’s Services Analysis Tool, developed by a group of London LAs and Ofsted to better visualise the data that is shared between a local authority Children’s Services department and Ofsted during an inspection.
This blog post is written by Alastair Lee, Children’s Services Data and Information Manager, East Sussex County Council and Chair of the Children’s Services National Performance and Information Management Group.
Contact Alastair: Alastair.Lee@eastsussex.gov.uk
It is part of a series published by the Rees Centre on data.
Related network:
Children’s Social Care Data User Group
How do we know if children’s social care services make a difference?
The development of an Outcomes Framework based on the views of those who plan, deliver and use these services, as well as the existing evidence base, aims to go some way to answering this question.
The final report published by the Rees Centre was launched at the Nuffield Foundation in London on 11 July.