Alumni of the Masters in Educational Assessment, Lorena Garelli and Kevin Mason presenting their dissertation research.
Trinity’s School of Education and the Educational Research Centre, Drumcondra, hosted AEA-Europe’s Annual Conference on 9-12 November in Dublin, Ireland.
Over 350 attendees from 37 countries reflected on the conference’s theme – “New Visions for Assessment in Uncertain Times” – and included more than 15 delegates affiliated with OUCEA. Throughout the conference, attendees explored possible directions for assessment policy and practice in schools, higher education, and vocational/workplace settings over the coming years. Much of the reflection centred on the instability of the recent past – the pandemic, the war in Ukraine, and global economic challenges have created a sense of uncertainty in all spheres of life. As a result, attendees took stock and reimagined assessment in a world where the certainties of past decades have given way to a more uncertain environment.
Keynote speeches addressed topics including “Assessing learning in schools – Reflections on lessons and challenges in the Irish context” and “Assessment research: listening to students, looking at consequences.”
In addition to the keynotes, the conference hosted panel and poster presentation opportunities. Many members and associates of the OUCEA shared their research. For example:
Honorary Norham Fellow
- Lena Gray – presented on assessment, policymakers, and communicative spaces – striving for impact at the research–policy interface
Honorary Research Associate
- Yasmine El Masri – an OUCEA Research Associate – presented on Evaluating sources of differential item functioning in high-stakes assessments in England
- Samantha-Kaye Johnston – an OUCEA Research Officer – presented on Assessing creativity among primary school children through the snapshot method – an innovative approach in times of uncertainty.
Current doctoral students
- Louise Badham – presented on Exploring standards across assessments in different languages using comparative judgment.
- Zhanxin Hao – presented on The effects of using testing and restudy as test preparation strategies on educational tests
- Jane Ho – presented on Validation of large-scale high-stakes tests for college admissions decisions
MSc in Educational Assessment graduates and students
- Kevin Mason and Lorena Garelli – presented on Assessment of Art and Design Courses using Comparative Judgment in Mexico and England
- Joanne Malone – presented on Irish primary school teachers’ mindset and approaches to classroom assessment
- Merlin Walters – presented on The comparability of grading standards in technical qualifications in England: how can we facilitate it in a post-pandemic world?
As these topics show, OUCEA is engaged in wide-ranging research. The team looks forward to presenting more of our work at AEA-Europe’s 2023 conference in Malta.
Dr Neil Harrison, Deputy Director
For some years now, prospective students applying through the UCAS system have been given the option of declaring whether or not they are care-experienced. Aside from helping statisticians, this self-identification information is passed confidentially to their university when they join to help them to target additional support such as bursaries, accommodation, academic help and mental health interventions.
There has been concern about how effective this system is. For example, we know informally that some care-experienced students are reluctant to tick the box because they worry about stigma, or that it will negatively impact their university application. Some applicants may not realise that they were in care, particularly if they were young at the time or if it meant living with relatives in a kinship care arrangement. Furthermore, not all students enter higher education through the UCAS system.
Anecdotally, there are also some people who tick the box when they are not care-experienced. These applicants may not understand the question – perhaps thinking it is about caring for other people – or may tick it by accident.
False positives and false negatives
There are thus two issues. The ‘false positives’ who say they are care-experienced when they are not; these create a bit of extra work (to do the checking) and are potentially a source of error in statistics. However, the ‘false negatives’ are more concerning. These are students who should be entitled to additional support from their university, but who are not getting it because their university doesn’t know they are care-experienced. It is obviously useful for policy and practice to know how many false positives and negatives there are.
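To make the two error types concrete, here is a minimal sketch. The field names `was_in_care` and `ticked_box` are hypothetical, standing in for the linked care history and the UCAS self-identification question, not the actual fields of any national dataset:

```python
# Each hypothetical record pairs a linked care history (was_in_care)
# with the self-identification tick-box response (ticked_box).
records = [
    {"was_in_care": True,  "ticked_box": True},   # correctly identified
    {"was_in_care": False, "ticked_box": True},   # false positive
    {"was_in_care": True,  "ticked_box": False},  # false negative
    {"was_in_care": False, "ticked_box": False},  # correctly unflagged
]

# False positives: ticked the box without a recorded care history.
false_positives = sum(r["ticked_box"] and not r["was_in_care"] for r in records)
# False negatives: have a care history but did not tick the box.
false_negatives = sum(r["was_in_care"] and not r["ticked_box"] for r in records)

print(false_positives, false_negatives)  # 1 1
```

The same counting logic, applied to the linked national data, underpins the proportions discussed below.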
The data that we’ve assembled for one of our projects has enabled us to shine a partial light on the self-identification data. It doesn’t completely answer the questions, as there are significant gaps in the data we have – we will touch on these later. However, it does give us some useful clues for the first time, which we thought worth sharing informally.
Exploring the data
We have anonymised data for England relating to the cohort of people born in the 1995/96 school year who remained in England between the ages of 11 and 18 – about half a million in total. We have been able to link data over time to combine care histories from the age of 8 (when the national data begin) with higher education records up to the age of 21. Therefore, we know (a) whether the student’s university believes them to be care-experienced, based on self-identification, and (b) whether they had indeed been in care.
To complicate matters, the university can allocate the student to one of two care-experienced categories. The definitions for these are very unclear, but we believe they are broadly intended to represent care leavers (meeting the statutory definition) and other care-experienced students.
The table above summarises what we have found, based on the data that were held at the end of the student’s first year. There isn’t space here to cover everything, but some basic observations:
- It’s clear that universities are not collectively using the two care-experienced markers appropriately, with nearly half of care leavers recorded in the ‘wrong’ category. The national data is therefore poor at differentiating between statutory care leavers and other care-experienced students.
- However, about 85% of statutory care leavers are being appropriately classified as care-experienced through self-identification. The other 15% are split between those stating that they are not care-experienced (i.e. false negatives) and those for whom the data are missing (perhaps due to refusal).
- The system is also reasonably good at identifying other students who were in care after the age of 14, with 75% self-identifying, although 17% stated that they were not care-experienced and 32% were wrongly classified as statutory care leavers.
- However, students who were in care between the ages of 8 and 14 were much less likely to self-identify as care-experienced – only 28% did so, with over half explicitly saying that they were not care-experienced.
- The ‘children in need’ group are not care-experienced (having been allocated a social worker, but not entered care), but there was a small proportion (3%) who had self-identified as such (i.e. false positives).
- The same was true for the general population. The proportion was very small, but this represented over 500 individuals. Some of these are undoubtedly false positives, but others may have been in (and left) care before the age of 8, including those adopted from care.
Implications for policy and practice
This small piece of analysis is not intended to be the final word and it is limited in some important ways. For example, we only have higher education data up to 2016/17 and the situation has almost certainly improved somewhat since then, with markedly more attention on care-experienced students over the last five years. We also only have data on younger students aged between 18 and 21, so the situation may be different for those entering higher education at a later stage. However, there are some useful lessons from the data:
- Firstly, the way in which data is being recorded by universities varies widely and this is likely to be leading to confusion, both in the provision of support and in understanding who is entering higher education. I am aware that the Office for Students is currently seeking to address this with the Higher Education Statistics Agency, UCAS and universities, which is a very positive step.
- Secondly, there is clearly some degree of incorrect self-identification – this is likely to be mainly accidental and probably reflects misunderstanding about what constitutes ‘care’ in this context. Nevertheless, this does mean that the self-identification data cannot be taken at face value and does need to be subject to confirmation by universities, creating a small administrative burden to ensure that support is correctly directed at those entitled to it. This requires universities to have a good understanding of care and a mechanism to enable students to evidence their status as sensitively as possible.
- Thirdly, a sizeable proportion of care-experienced students across the various categories are being missed by the self-identification system, especially among those who left care before their teenage years. This suggests that there is much more work to be done to ensure that care-experienced students are aware of the benefits of self-identifying and feel able to do so without stigma. Clearly, however, they must always have the right not to share this information about themselves if they prefer – or to do so at a later date.
A positive development in recent years is that many universities have broadened their support – extending beyond statutory care leavers and removing age thresholds. This is to be welcomed, as it is not just younger care leavers who experience educational disruption and who can benefit from additional help to enter, and thrive within, higher education. These data suggest, however, that there is still work to be done to reach all those who are entitled to receive it.
I am delighted to chair the evidence group for Josh MacAlister’s review of the care system, described by the Secretary of State who launched it as a “wide-ranging, independent review to address poor outcomes for children in care as well as strengthening families to improve vulnerable children’s lives.”
Josh MacAlister and the review team published their opening position on Thursday 17th June, a statement on the case for change.
The review has heavy billing, not least as the level of government borrowing is higher now than when a previous Conservative chancellor demanded austerity. The current administration may be more inclined to spend but will they spend on children outside of universal services?
The review is also in the shadow of the recent review of the system in Scotland, which was much longer, was evidently led by those with experience of care, and reported into a devolved administration that has a clear articulation of its commitment to deliver the rights of children.
The pressure is certainly on. The case for change sets out Josh’s and the review team’s interpretation of what they have heard so far, from listening to and reading the evidence of personal testimony, academic research, expert views and responses submitted to the review. From this, the case for change indicates where the review team think the system needs to change.
It is very welcome that the review is publishing the case for change so that everyone with an interest knows where the review will focus and is able to respond on the more specific issues. I know that Josh and the team are very open to all responses on these questions, and that they are listening. People can respond here.
There are two old clichés about how those outside positions of power in government might best engage in the business of government. One cliché is the lift test: what do you do if you find yourself in a lift with a senior figure? How do you cut through, what do you say, how can you be heard? The second cliché concerns a train leaving the station. You don’t get to decide when the train runs; your choice is whether to ride the train.
The Evidence Group is one of three groups providing support to the review. Of particular importance is the Experts by Experience Board, there to ensure a voice in the review for those who have had a social worker (either themselves or a child in their care). The Design Group will help guide how the review designs its recommendations.
In my work I have tried to bring good evidence to bear on policy and practice and to help ensure it is used meaningfully and accurately. To do this we need a clear idea about what we mean by good evidence: what counts as evidence, for whom, about what, applied how? We might call this an episteme – a framework of agreement about what counts as knowledge and how it should be interpreted – which allows us to settle on truths, or at least determine what we mean by truthfulness, in how we answer research questions intended to inform decisions.
As chair of the evidence group, I can say that the review team have had access to a great deal of high-quality evidence of multiple sorts on multiple questions. In the time available, Josh and the team have made their interpretation of what it says about what should change in the care system, focusing particularly on problems and issues requiring attention rather than on the many daily successes and positive outcomes that make up so much of existing practice and experience.
The members of the evidence group appointed by the review have submitted their views on the review team’s reading of the evidence and on the team’s interpretation and representations of it. Ultimately the case for change is not primarily an evidentiary paper in the sense of being set up as a research or science project with a clear technical methodology to address a narrow scientific or social scientific question. It isn’t subject to formal peer review and approval in the way that a National Statistic or an academic paper might be. Neither I nor the other group members get to sign off the document. It is ultimately the view of Josh and his team and that is in part what is meant by an independent review. Another reviewer might have looked at the evidence differently and made a different case or called for different changes.
I hope it leads to a fruitful discussion. For what it is worth, I think the field suffers from a lack of agreement about what counts as good evidence. Because of the nature of the evidence available so far and the diversity of views on it, many of the issues in the case for change are subject to considerable uncertainty and disagreement, so it is likely that debate will continue.
I don’t think the question of the appropriate balance between statutory care of children and wider support to families is resolved by the evidence available, nor do we know enough in aggregate about what structures best help people provide the right supports to which groups of children at the right times. I agree it is good to have a debate about these things. The available evidence can inform and there will be more evidence gathering in the next stages of the review.
I hope that the review goes on to make valuable and effective recommendations that address many of the issues and challenges raised in evidence to the review, and that these lead to real improvements in the experiences of children, families and care-experienced people. I hope that the review is able to address the clear call from those with care experience to be heard, not just in the review, but in perpetuity. Finally, I hope the review addresses the need to improve knowledge and understanding, both in terms of how the care system might be improved and in helping the public, and hence government, recognise the work and hear the voices of care-experienced people, children, social workers, carers, directors of children’s services and others who are too often drowned out of the public debate.
We will all have differing views on all of this. I hope we will have more blogs in the weeks ahead.
Knowledge as the Anchor
In a recent conversation, one of our Derbyshire Attachment Aware Schools (AAS) told me: ‘AAS has been our anchor during this COVID pandemic storm.’ They discussed how the knowledge gained through the Attachment Aware Schools programme – about attachment, trauma, brain development, and reactions to our world and life events – had really benefited their school. They described how potent being an AAS school has been for their understanding of, and ability to deal with, the challenges currently being faced in education. The AAS knowledge has given them a strong foundation on which to build their recovery curriculum, and they report how stabilising this has been at a time of such unprecedented disruption to life and routines in school and the world at large.
This school and many other AAS schools in Derbyshire have reported feeling more resilient and confident in supporting the children and staff in their settings to make the best of a really difficult time, personally and professionally.
“We felt as well prepared as we could be when the children returned after lockdown because of being on the Derbyshire AAS programme. We knew we would see challenging behaviour, and other changes, and felt we had a secure knowledge base with which to address any issues as they arose.”
The Derbyshire Attachment Aware Schools Programme
AAS is designed to work with schools and settings to explore human development and behaviour, and how these affect learning. The programme fills an identified gap in human development and relational practice that many teachers and school staff say was missing from their initial professional training. AAS enables schools to re-examine their practice, policies and systems to develop a whole-school ethos where relationships are truly understood to be the cornerstone of learning.
“Being part of the AAS programme and schools’ network has given us the confidence to look at our whole school’s provision – prior to the pandemic, and now moving out of lockdown – and to think carefully about what we want to retain/reinstate in the future. We can look at all of this through a trauma-informed perspective and think about what’s really best for our school community.”
So what is an Attachment Aware School?
The Attachment Aware Schools programme offered to schools in Derbyshire is a whole-school learning and development programme. Using attachment theory and neuroscientific knowledge as an underpinning theoretical framework, we explore behaviour and the impact that poor early life and traumatic experiences can have on human growth, learning and development. Schools on the programme deepen their understanding of human behaviour and relationships through a year-long series of taught inputs and supported action research projects. The learning journey is designed to help schools focus on the unique set of circumstances that constitute their school community, and on how best to address the needs and challenges that will inevitably arise in an intergenerational working community.
“Our increased understanding of attachment needs has influenced policy, systems and most importantly support for our students at every level. The whole ethos of school has changed. We now have the resilience to take risks and support each other to meet the challenges of our most vulnerable students.”
Ethos, Mindset and the Golden Thread
Our AAS programme is designed to develop mindset, ethos and practice in schools and education settings. It is not a toolkit of prescribed interventions or practices; as helpful as resources can be, they don’t always have the sustainable impact that we know schools want and need to truly embed and maintain new and more effective ways of working. Our mission is to help bring about a renewed understanding and approach to behaviour to maximise the potential and outcomes of children, young people and, in fact, all those who work and learn together in education.
The ‘golden thread’ that holds all of our ‘graduate’ schools together in our AAS network across the county is an understanding of the importance of building and maintaining good relationships: young person to adult, young person to young person, and adult to adult. Placing this understanding of the impact of human dynamics at the heart of school ethos and practice builds a safe and nurturing learning environment where all learners and their educators can flourish.
What has been the impact to date?
We have seen improvements in:
- Relationships in school
- School experience for pupils
- Reduced levels of anxiety, stress and worry
- Effectiveness of policies and communication systems
- Staff attitudes to work
- Student behaviour – lower level of incidents and disruption
- Academic progress and attainment
…and best of all – better relationships and a deeper understanding of the needs of children, young people and colleagues, to ensure the best experience and outcomes in every school day.
Attachment Aware Programme Lead
Image attribution: Anchor created by freepik – www.freepik.com
Dr Neil Harrison, Deputy Director of the Rees Centre
The team in the Department for Education (DfE) that produces statistics on progression to higher education have really upped their game recently. Starting with a trial last December, they are now publishing an annual digest of statistics looking at a wide range of demographic and educational groups, helpfully including a backwards time series. The latest of these digests was published a couple of weeks ago and covers the 2018/19 academic year. Importantly, these statistics are based on linking – at the individual level – the data collected by universities with that collected by schools and colleges, providing a rich lens to understand inequalities in the system.
Interestingly, one of the groups explored is care leavers. I have written before about issues with the statistics produced from the data collected by local authorities (the so-called ‘SSDA903’ data) and the new DfE digest represents a significant step change as it reflects definitive records about who has gone on to higher education, including in further education colleges and private providers.
It’s also important to note that the definition of ‘care leaver’ used is slightly quirky, in that it is not the statutory one. The definition used for analysis is those children in care continuously for the 12 months up to 31st March in the academic year when they turned 16 (i.e. Year 11 for the vast majority). In other words, the definition captures only those with a good degree of stability, although they may have changed placements in this time. It effectively excludes most of those entering care at 14 or 15.
What do the new statistics say?
The statistics in the digest reflect progression to higher education by the age of 19 – i.e. allowing for one ‘gap’ year after school/college. There are issues with this that I will return to shortly. The data focus on English young people, but include (most) higher education elsewhere in the UK. For the purposes of this blog post, I’ve brought together several of the groups covered by the digest in the time series chart below:
We look first at the blue line, representing care leavers. The progression rate for 2018/19 was 13%. This is more than double the oft-(mis)quoted 6% figure that comes from the SSDA903 dataset, and I am confident it is a much more realistic reflection of the situation. There has been a fairly steady rise from 9% in 2009/10, with a couple of one-year blips, which is also good news. This fits well with what universities say – I hear many reports of year-on-year growth in care leavers and other care-experienced students.
However, the yellow line shows the situation for young people who are not care leavers and this starkly demonstrates a persistent inequality – the progression rate for this group was 43% in 2018/19. If anything, the gap between the blue and yellow lines has widened slightly over the ten years of the time series, from 25 percentage points in 2009/10 to 30 percentage points in 2018/19. This is worrying, as it suggests that care leavers have not been able to expand their ‘share’ of higher education at the same rate as other young people.
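The gap here is a simple difference in percentage points. As a quick sketch using the rates quoted above (the 2009/10 rate for other young people, 34%, is implied by the 9% care-leaver rate and the stated 25-point gap):

```python
# Progression rates (%) quoted in this post; the 2009/10 figure for
# "others" is implied by the 9% care-leaver rate and the 25-point gap.
care_leavers = {"2009/10": 9, "2018/19": 13}
others = {"2009/10": 34, "2018/19": 43}

# Gap in percentage points for each year of the series.
gaps = {year: others[year] - care_leavers[year] for year in care_leavers}
print(gaps)  # {'2009/10': 25, '2018/19': 30}
```

Working in percentage points (rather than ratios) is what shows the gap widening from 25 to 30 points, even though the care-leaver rate itself rose over the decade.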
As I discussed in my 2017 ‘Moving On Up’ report, it is important to remember that there are strong explanatory factors at work and when you compare care leavers with similar demographic and educational profiles, much of this difference disappears. For example, care leavers are significantly more likely to have special educational needs which impact on their attainment and therefore on their ability to pursue higher education – at least in the short term. We will almost certainly never be in a position to eliminate the gap, but we should collectively be aiming for these lines to converge over time.
How do care leavers compare to other disadvantaged groups?
The green line represents young people who were eligible for free school meals when they were in Year 11. There are, once again, issues with this definition and what it means, but this is a useful broad proxy for children who grew up in economically disadvantaged households. The 2018/19 progression rate for this group was 26% and therefore double that of care leavers. Again there has been a widening of the gap across the time series, from 10 percentage points to 13 percentage points.
Finally, the red line – for which only four data points are available – represents children designated as being ‘in need’ on 31st March in the academic year when they turned 16. Interestingly, the higher education progression rate for this group is actually slightly lower than for the care leaver group, at 11% in 2018/19.
This is consistent with other analysis, including the Rees Centre’s recent report (with the University of Bristol) looking at educational outcomes for children in need. More research is needed to understand this fully, but it suggests that long-term, stable care placements often – if not always – support progression to higher education, relative to other young people who experience profound challenges within their birth families.
Why is looking at progression at age 19 an issue?
All quantitative analysis of social data is driven by definitional issues. These are rarely neutral or objective – you have to decide what groupings to use, how you determine the boundaries and so on. As discussed, the new DfE digests use a particular definition of a care leaver – if they used a different definition, the analysis would yield different results.
One decision is about time cut-offs. This is always tricky. The longer timeframe you look at, the less reliable the historic data become – if they exist at all. The DfE’s cut-off at the age of 19 is a longstanding one and makes sense for the general population who most commonly progress immediately after school/college or after a gap year.
However, as I’ve shown elsewhere, this does not hold for care-experienced students. The social and educational disruption they undergo as a result of their care journeys means that they are often not qualified or ready to pursue higher education at 18 or 19. In fact, most who do go to university do so in their 20s or even later in life. We don’t yet know for sure, but it is likely that something like 25-30% of care-experienced people will undertake higher education at some point in their lives.
This is still not high enough, but the DfE digest – useful as it is – can only ever be part of the story and the blue and yellow lines would be closer if a longer timeframe were used.
A final note…
It is always important to remember that progression into higher education is only one side of the coin and that there is good evidence that care leavers and other care-experienced students are at greater risk of leaving higher education early. It would be great to see some official figures from the DfE on this at some point, to help us to understand the scale of the problem.
Contact Neil: email@example.com
By Dr Rebecca Eynon (Associate Professor between the Department of Education and the Oxford Internet Institute) & Professor Jo-Anne Baird (Director of the Department of Education)
Now that the infamous Ofqual algorithm for deciding the high-stakes exam results of hundreds of thousands of students has been resoundingly rejected, the focus turns to the importance of investigating what went wrong. Indeed, the Office for Statistics Regulation has already committed to a review of the models used for exam adjustment within well-specified terms, and other reviews are likely to follow shortly.
A central focus from now, as students, their families, educational institutions and workplaces try to work out next steps, is to interrogate the unspoken and implicit values that guided the creation, use and implementation of this particular statistical model.
As part of the avalanche of critique aimed at Ofqual and the government, the question of values comes into play. Why, many have asked, was Ofqual tasked, as it is every year, with avoiding grade inflation as its overarching objective? Checks were made on the inequalities in the model, and they were consistent with the inequalities seen in examinations at a national level. This, though, raises the question of why these inequalities are accepted in a normal year.
These and other important arguments raised over the past week or so highlight questions about values. Specifically, they raise the fundamental question of why, aside from debates in academia and some parts of the press, we have stopped discussing the purposes of education. Instead, a meritocratic view of education, promoted since the 1980s by governments on both the right and left of the spectrum, has become a given. In place of discussions about values, there has been an ever-increasing focus on the collection and use of data to hold schools accountable for ‘delivering’ an efficient and effective education, to measure students’ ‘worth’ in ways that can easily be traded in the economy, and to water down ideas of social justice and draw attention away from wider inequalities in society.
Once debates about values are removed from our education and assessment systems, we are left with situations like the present one. The focus on creating a model that makes the data look like past years’ – with little debate over whether the aims should have been different this year – is a central example of this. Given the significant (and unequal) challenges young people have faced during this year, should we not, as a society, have wanted to reduce inequalities in any way possible?
The question of values also carries through into other discussions of the datafication of education, where the collection and analysis of digital trace data – i.e. data collected from the technologies that young people engage with for learning and education – is growing exponentially. Yet unlike other areas of the public sector, such as health and policing, schools rarely feature centrally in policy discussions and reports on algorithmic fairness. The question is why? There are highly significant ethical and social implications of extensive data use in education that significantly shape young people’s futures. These include issues of privacy, informed consent and data ownership (particularly given the significant role of the commercial sector); the validity and integrity of the models produced; the nature of the decisions promoted by such systems; and questions of governance and accountability. This relative lack of policy interest in the implications of datafication for schooling is, we suggest, because governments take for granted the need for data of all kinds in education to support their meritocratic aims, and indeed see it as a central way to make education ‘fair’.
The Ofqual algorithm has brought to our attention the ethics of the datafication of education and the risk it poses of compounding social inequalities. Every year there is injustice not only in the unequal starting points and unequal opportunities young people have within our schools and in their everyday lives, but also in the pretence that extensive use of data is somehow a neutral process.
The important reflections and investigations that now take place over the coming weeks and months should include a review that explicitly places values and ethical frameworks front and centre, and that encourages a focus on the purposes of education, particularly in (post-)pandemic times.
By Celine Gross, associate at Social Finance. Part of a series published by the Rees Centre on data (see below for related posts).
As a data scientist focusing on Children’s Services, I’ll admit that the first words that spring to mind when you mention Annex A and CIN Census are “inconsistent” and “designed for compliance”. That said, these datasets hold a wealth of information, and my team found a powerful way to use them to provide evidence of the quality of children’s journeys through social care. It may also be useful for you.
For those who don’t know them, Annex A and the CIN Census are children’s services datasets created by local authorities. Annex A is a document prepared for Ofsted inspections of children’s services departments, listing all events that happened in the last six months (contacts, referrals, child protection plans, etc.) and some information on children looked after (care leavers, adopters, etc.). The CIN Census is submitted annually by local authority children’s services to the Department for Education and contains details only of children ‘in need’ (being assessed or under a plan) from the past fiscal year.
What Works for Children’s Social Care provided a grant to the team at Social Finance to facilitate this work, and we also received grant funding from the Christie Foundation. Both streams of funding allowed us to explore how data could be used to identify promising practice. Although the project had some unintended outcomes, we were able to create and share tools such as template Data Processing Agreements, which can help other partnerships between Local Authorities (LAs), or between LAs and partners, to work together safely and ethically on analysing individual-level data.
From ‘events’ to ‘journeys’
We took the lists of events from Annex A and the CIN Census and turned them into ‘child journeys’ – strings of events concerning the same individual. We did this using Python, but you could do the same with R or other programming languages. This allows us to make journeys themselves the objects of analysis, or to analyse events based on what happens before or after them.
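As a minimal sketch of the idea – with invented child identifiers and event names, not the actual fields in Annex A or the CIN Census – grouping events by child and sorting them by date yields one journey per child:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event records as (child_id, event_date, event_type) tuples;
# real Annex A / CIN Census extracts contain many more fields.
events = [
    ("child_1", date(2020, 1, 5), "contact"),
    ("child_1", date(2020, 1, 9), "referral"),
    ("child_2", date(2020, 2, 1), "contact"),
    ("child_1", date(2020, 3, 2), "assessment_start"),
]

def build_journeys(events):
    """Turn a flat event list into per-child journeys, ordered by date."""
    journeys = defaultdict(list)
    for child_id, event_date, event_type in events:
        journeys[child_id].append((event_date, event_type))
    for journey in journeys.values():
        journey.sort()  # chronological order within each child's journey
    return dict(journeys)

journeys = build_journeys(events)
print([event_type for _, event_type in journeys["child_1"]])
# ['contact', 'referral', 'assessment_start']
```

Once events are strung together like this, a journey (rather than a single event) becomes the unit of analysis.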
A simple journey, simply expressed, looks like this:
With a bit of visualisation, a more complex journey looks like this, over an 18-month timeframe:
We found that analysing journeys opened new possibilities for looking at the quality of processes of a system, rather than at their compliance.
Often, this starts with a practice insight – ‘X shouldn’t generally happen after Y’ – allowing us to define an archetypal journey our local authority partners are interested in, and to see how often it happens, and in what contexts.
One example focused on ‘potentially missed risk at assessment’ – a journey where a child’s first assessment did not result in a plan (a higher-level intervention) but a subsequent referral within six months did.
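As an illustrative sketch (the event names and journey structure here are invented, not the project’s actual code), such an archetype can be expressed as a predicate over a child’s date-ordered event list:

```python
from datetime import date, timedelta

def potentially_missed_risk(journey, window_days=183):
    """Flag journeys where an assessment ended with no plan, but a new
    referral arrived within roughly six months of that assessment.

    `journey` is a date-ordered list of (date, event_type) tuples;
    the event names are illustrative, not official codes.
    """
    for i, (assess_date, event_type) in enumerate(journey):
        if event_type != "assessment_no_plan":
            continue
        for later_date, later_type in journey[i + 1:]:
            if (later_type == "referral"
                    and later_date - assess_date <= timedelta(days=window_days)):
                return True
    return False

journey = [
    (date(2020, 1, 10), "referral"),
    (date(2020, 2, 1), "assessment_no_plan"),
    (date(2020, 5, 20), "referral"),  # re-referral ~3.5 months later
]
print(potentially_missed_risk(journey))  # True
```

Counting how often such a predicate fires – overall, by demographic group, or by time of year – is what turns journeys into the kinds of numbers described below.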
Some numbers jumped out for social care service managers and quality assurance directors:
· In some local authorities, these ‘potentially missed risk’ journeys were happening 25% more often with some demographic groups than others, and more often with certain types of risk – identifying opportunities for quality audit and the types of cases first-line managers should review;
· In some local authorities, they happened more often at the weekends and at busy times of year – putting some numbers on things managers had long suspected to be true.
Many of these patterns were present in some local authorities but not others, highlighting possibilities for sharing of good practice.
Can you use journeys in your work?
These are just a few examples of what journey analysis can do. There are many more interesting journey archetypes and use cases that could turn this journey analysis into better decisions, targeted service improvement work and interesting research. We’re hoping to partner with universities and with local authorities to take this forward.
We’re keen to share what we learnt. We have started publishing our journey code on GitHub. We’re excited to see how it could be improved and how it could be used in local authorities (with around 600 analysts across England), by researchers and by others. We’ve already benefitted massively from being able to check our interpretations of how the data represents reality with analysts and managers in four local authorities, and we hope that GitHub can become a place to grow that collaboration.
Please feel free to add your own tools as well as to improve the ones we have started. You might also be interested in the code that cleans Annex A according to the Ofsted guidance here.
What Works for Children’s Social Care will soon share a report detailing aspirations for the project, what we learned and what we see as possible and useful in this type of work. Please visit their website for details.
There is a growing community of data-minded people, passionate about supporting better decision-making in the children’s services space. Let’s continue collaborating and building on each other’s learning: if you see the potential of child journeys, please get in touch! And do share your ideas: what tools did you create?
This blog post was written by Celine Gross from the consultancy Social Finance.
Contact Celine: firstname.lastname@example.org
It is part of a series of posts published by the Rees Centre on data. The Rees Centre welcomes guest blog posts from professionals across the sector. Views expressed are authors’ own and do not represent those of the Centre.
Related Rees Centre blog posts:
Children’s Social Care Data User Group
The Children’s Social Care Data User Group (CSC DUG) was set up in 2017 by the Rees Centre, University of Oxford and Thomas Coram Research Unit, UCL. It is a network of academics, local authority data managers, analysts, charities and funders with a shared vision that administrative data from children’s social care and other relevant agencies in England can be analysed and fed back into policy and practice to improve the way that children’s social care services respond to children, young people and their families.
The group focuses on data submitted to and analysed by the Department for Education (namely the SSDA 903 children looked after data and the Children in Need Census data).
Membership is open to any individual or organisation who is using, or plans to use, children’s social care data collated by the Department for Education (Child in Need data, Looked After Children data or Section 251 social care expenditure data in relation to children’s services).
To join the group’s mailing list: email email@example.com
By Ellie Suh (Research Officer, Rees Centre, Department of Education)
The Cost Calculator for Children’s Services (CCfCS, or Cost Calculator) is a research-based purpose-built tool that helps local authorities to assess and analyse the costs of providing social care to children in care. The current Cost Calculator team includes Dr Lisa Holmes (director of Rees Centre), Helen Trivedi and myself. I am currently taking part in the SUCCESS programme which, funded by Aspect, provides training, support and funding to help social scientists transform innovative and marketable ideas into a business or social enterprise. In this blog, I explain the motivation for taking the Cost Calculator to the social enterprise model.
The tool was initially developed to facilitate analysis for academic research conducted by Dr Lisa Holmes and colleagues to understand the relationship between the needs, costs and outcomes of children in care. Before that research, costs were often understood only at an aggregate level or as a per-head figure, which did not reflect the detail of children’s social care. These are useful summary figures; however, they are not sufficiently detailed to assess cost-effectiveness or to inform strategic decision-making in service provisioning or commissioning.
The research team used a bottom-up unit cost methodology, which makes it possible to analyse costs at various levels of detail – for an individual child, a cohort of children, or by needs group. The tool can break down costs by placement type or service provider and can provide what-if analysis. Richer analysis gives local authorities greater scope to monitor progress and to evaluate new initiatives in a systematic and transparent manner. Around 50 local authorities have, at different points, either engaged in the research or worked with the tool, and a number of case studies highlight the benefits the analysis has provided.
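In broad strokes – and with figures, placement types and field names invented purely for illustration, not drawn from the CCfCS – a bottom-up approach prices each episode of care from a unit cost and its duration, then aggregates at whatever level is needed:

```python
# Illustrative bottom-up unit costing: each placement episode is priced
# from a weekly unit cost and its duration, then aggregated per child.
# All figures and placement types are invented for this sketch.
weekly_unit_cost = {"foster": 500.0, "residential": 3500.0}

episodes = [
    ("child_1", "foster", 10),       # (child, placement type, weeks)
    ("child_1", "residential", 2),
    ("child_2", "foster", 26),
]

def cost_per_child(episodes):
    """Sum episode costs (weekly unit cost x weeks) for each child."""
    totals = {}
    for child, placement, weeks in episodes:
        totals[child] = totals.get(child, 0.0) + weekly_unit_cost[placement] * weeks
    return totals

totals = cost_per_child(episodes)
print(totals)  # child_1: 10*500 + 2*3500 = 12000.0; child_2: 26*500 = 13000.0
```

The same episode-level records can be re-aggregated by cohort, needs group or placement type, which is what makes the bottom-up approach more flexible than a single per-head figure.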
The tool was initially developed as an Access database, but technology and the uses of data have evolved. Over the past three years the research team has also moved from Loughborough University to the Rees Centre at the University of Oxford, and we have used this time to reflect and plan next steps. This is why the Cost Calculator is moving to a web-based platform. As technology advances, users expect a more secure, efficient and intuitively designed interface. More local authorities are paying attention to collecting data and maintaining its quality, and we have seen greater use of data dashboards and visualisations. The sophisticated analytics provided by the Cost Calculator could empower local authorities looking to make informed decisions and to strategise in response to increased demand for children’s social care.
To move to the next stage of development of the Cost Calculator, we consider the social enterprise model best aligned to our needs. While it will make no profit, the social enterprise model provides an operable structure that enables both maintenance and development of the tool. Revenue will be generated through a clearly defined and affordable pricing model, which improves the financial sustainability of this social initiative, and will be used to support end-users and keep up with the latest technology. Without a sustainable business model to support continuous development, the contribution of this academic research is likely to be short-lived.
The team is working on setting up the social enterprise so that this academic research can continue to make a contribution to a wider community in a meaningful and sustainable way. I will keep you updated on my journey through the SUCCESS programme – watch this space.
Get in touch with Ellie – firstname.lastname@example.org
By Dr Velda Elliott (Associate Professor of English and Literacy Education and Director of Doctoral Research)
The DPhil in Education is an advanced research degree awarded on the basis of a thesis and an oral examination. If studying full-time, the programme takes three to four years to complete. By the end of the programme, graduates are equipped with a wide range of research skills as well as in-depth knowledge, understanding and expertise in their chosen field of Education research.
Graduates of the Oxford Education DPhil are all over the world embarking on an amazing range of careers. Below are just a handful of examples of how our former students are currently making a positive difference in the world within the sphere of Education.
Graduates In The Third Sector and Beyond
Lila McDowell (DPhil in Education, 2012) is the Deputy Director of Hudson Link, a non-profit organisation which provides college education, life skills and re-entry support to incarcerated and formerly incarcerated men and women, helping them make a positive impact on their own lives, their families and communities, and resulting in lower rates of recidivism, incarceration and poverty. In her ‘spare’ time she teaches classes at John Jay College of Criminal Justice. Read her article on the need for formerly incarcerated leaders for prison education programmes here.
Mahmoud Natout (DPhil in Education, 2014) founded a consultancy in Lebanon which works to enable organisations to develop play-based learning and to leverage the power of play to unlock human potential for creativity and innovation at all stages of life. Ashish Jaiswal (DPhil in Education, 2011) is a freelance author and consultant who has spoken and taught all around the world. His latest book, Fluid, assimilates lessons from the approaches of geniuses throughout history and offers a fresh model of learning and thinking. His previous book, How to Reform a Business School, was based on his thesis, a multi-year case study of how Yale transformed its business school.
Aqeela Datoo (DPhil in Education, 2013) is Strategic Partnerships Manager at the Aga Khan Foundation, supporting the programmatic design of education worldwide. Before this role she spent three years ‘on the ground’ with the Aga Khan Foundation in India, leading their education programmes. Hannah Grainger Clemson (DPhil in Education, 2011) is Schools Policy Officer at the European Commission, where, as well as using the expertise developed during her doctorate in her work, she has followed up her Rugby Blue from Oxford by becoming part of a European Camogie team, which won the World Games last year!
Ariel Liu (DPhil in Education, 2013) bridged the gap between the academic and the non-academic world. After graduating from Oxford, Ariel worked at the Stanford University Department of Education as a post-doctoral research fellow, where she researched digital video collaboration and the use of mobile computers in education. She then joined Google, first as a User Experience Researcher for Search and Maps, and now as a Senior User Experience Researcher and manager. Currently, she leads a team of researchers building user-first experiences for Google Search, Maps, and Assistant.
Graduates In Higher Education
Many of our alumni work as Higher Education lecturers, including Natalie Lundsteen (DPhil in Education, 2011) who is Assistant Dean for Career and Professional Development at University of Texas Southwestern. She visited the department earlier this year and delivered an excellent session on thinking about careers post MSc or DPhil. Patrick Alexander (DPhil in Education, 2011) is Director of Research in the Oxford Brookes Faculty of Education, just across town, so is also a frequent visitor back to the department. Tania Saeed (DPhil in Education, 2013) is Assistant Professor at Lahore University of Management Sciences, Pakistan, where she has published two books since completing her DPhil, including one based on her thesis, co-authored with her supervisor Dr David Johnson. Nick Hopwood (DPhil in Education, 2007) is Associate Professor at the University of Technology, Sydney, where he also maintains a strong public engagement presence, including recording his ‘wall of failure’ as a public service to academia! (Nick originated the still immensely popular Advanced Qualitative Research course at the Department, which supports advanced doctoral students with their data analysis.) Prachi Srivastava (DPhil in Education, 2005), who co-authored a methods article with Nick when they were both students, is Associate Professor in Education and International Development at Western University, Canada. She is also a Member of the World Bank Expert Advisory Council on Citizen Engagement.
Mitsuko Matsumoto (DPhil in Education, 2012) is a lecturer at the Autonomous University of Madrid; James Hall (DPhil in Education, 2009) is Associate Professor at the University of Southampton.
Others work in non-academic roles in universities, including Aly Kassam-Remtulla (DPhil in Education, 2012), who is now Associate Provost for International Affairs and Operations at Princeton University. Gill Houston chairs the UK Council for Graduate Education – but to be fair she had a lifetime of working in Research Student policy and practice before she completed her doctorate with us!
Graduates in the department!
And of course there are quite a few of us still in Oxford… I’m one, as is Susan James Relly, the Associate Head of the Social Sciences Division, and Alis Oancea, Professor of Philosophy of Education and Research Policy and Director of Research. Steve Puttick, our Associate Professor of Teacher Education, is another, as are James Robson, Departmental Lecturer in Higher Education, and Jessica Briggs Baffoe-Djan, Departmental Lecturer in Applied Linguistics. Both our History educators – Katharine Burn, Associate Professor of History Education, and Jason Todd, Departmental Lecturer – also gained their DPhils in the department. Recent graduate Natalie Usher is an Educational Development Advisor with the University’s Centre for Teaching and Learning.
To discover more about the DPhil in Education at the department, visit: www.education.ox.ac.uk/programmes
Join the Oxford Education DPhil Community
What did an Oxford Education DPhil do for you? Let us know by joining the newly created LinkedIn Alumni Group, or search ‘DPhil Education Oxford Alumni’ on LinkedIn.
Meet some of the alumni mentioned in this post: