
Department of Education


By Victoria Bogdanova, DPhil student

It’s always nice to receive New Year wishes, but some messages are really precious. This year I got a call from Petya (name changed), one of my former students. Today, he is a confident, handsome young man with a brilliant sense of humour who makes a lot of jokes. I was his maths teacher and tutor five years ago, and the story was very different back then.

For over 10 years I’ve been working for a Moscow charity called Big Change which supports care-experienced children and young people in their education. Petya was the first student for whom I was also a tutor, meaning that I not only taught him maths but I was also responsible for his individual learning plan and general life issues.

He was 17. Lived in an orphanage with his younger brother. Had failed most of his exams the previous year. Had terrible relationships at school (he could be very naughty and vengeful when he needed to defend himself). As a last hope, he was brought to our charity by his social worker.

Unfortunately, it was a typical situation. In state schools in Russia, teachers often lack the time, resources and training to support teenagers with behavioural and educational difficulties. It’s no secret that teenagers in care like Petya have experienced so much loss and trauma at a young age that it has led to severe gaps in learning. For example, Petya’s knowledge of maths and Russian was at primary school level when I first met him, even though he was supposed to pass his secondary school exams in a few months. Failing these exams restricts further educational and career opportunities. Many of these young people also suffer from drug or alcohol misuse or have criminal records.

When I first met Petya, his hood covered his face, he never looked into other people’s eyes, and he said no more than a few words. What could I have talked to him about? Definitely not maths… rap was the only thing I knew he seemed to be interested in. So I started listening to rap songs at home, to have some topics in common. Surprisingly, it helped, and I celebrated a small victory when, after a maths class, he looked around and said, “You should listen to Tupac, I think you might like it.”

Over a year of lessons with Petya, I learned a lot: how smart and quick-witted he was, how polite he tried to be (he apologised every time a swear word accidentally slipped out in my presence), how deeply he loved his younger brother, whom he had saved from starving when their mother had been drinking, and how much he cared about his elderly grandmother, who cooked her best borscht for him every time he came to see her. Of course, I also learned a lot about teenagers’ slang and culture. Petya, in return, stopped wearing a hood, started smiling, raised his head and, hopefully, learned something about maths.

Petya passed some of his exams but not all, and couldn’t continue his education. However, he now lives independently, supports his brother, and has a job where his employer appreciates him. The fact that he calls me every New Year’s Eve makes me think that my work was not in vain. It’s not about maths, of course, but about a degree of trust in the world that was fostered in this once aloof young man.

In my PhD research, I am trying to describe not only the approaches that can help these young people catch up with their studies, find their confidence and thrive in life, but also what teachers can do to support them.

By Lucy Robinson, DPhil student

On 29th November 2023, I had the pleasure of attending and delivering a workshop at the SCiP Alliance’s Annual Conference, on the theme ‘Identity Matters’. The conference sought to gather “practitioners, researchers, policymakers and funders to focus our collective attention, expertise and effort on the question of Service child identity, on Service children’s identities” and explore “why identity matters, broaden our understanding of the diverse expressions of what it means to be a Service child, and how, by considering identity matters in our work, we can help Service children to thrive” (SCiP Alliance, 2023).

Opening the conference was a keynote from Phil Dent, Director of the SCiP Alliance, in which he conceptualised service children’s identities as ‘distinctive, diverse and dynamic’; a thread that ran throughout the day and very much underpins my own research. The conference had a packed schedule with exhibitions, a further keynote, a panel discussion and a range of workshops to attend. As a doctoral researcher in this field (indeed, my thesis title is: How does military life shape service children’s identity and school experiences?), the annual conference is the perfect space to share my research and foster new connections. Thankfully, the schedule allowed plenty of time for networking and gave me the opportunity to meet with colleagues, old and new. It was particularly lovely to connect in person with three PhD students in the early stages of their respective research projects.

In the morning, I ran a workshop titled ‘Translating child-centered creative research methods into the school context’, attended by a range of stakeholders. Over the hour, I gave a brief outline of my doctoral research (including its aims and link to school practice) before going into greater detail about one of the creative research methods that I had utilised in my research – the ‘All about me self-portrait and relational map’. I spoke about the rationale behind my choice of this method and how it linked to my research questions before sharing some examples created by my research participants.

To highlight the transferability of the method into the school context, I invited workshop attendees to make their own ‘All about me self-portrait and relational map’. Whilst doing so, I asked attendees to think about what questions it brought up for them and how they might translate it into their own professional contexts, whether in schools or not. It was gratifying to speak to one attendee who informed me that they would be bringing the activity back to their team to use as part of their mentoring programme for vulnerable ex-prisoners, to open a discussion on their identity after leaving the prison system.

The last 25 minutes or so of the workshop were dedicated to questions and discussion. It was a pleasure to talk in greater depth about my recruitment process and research ethics, and to discuss the nuances of the relationship between service rank and socio-economic status and the potential impact this has on service children’s lived experiences.

In the afternoon, I attended a fantastic workshop by Forces Children Scotland, a charity that supports children and young people from military families (serving and veteran) by providing opportunities and “amplifying their voices to inspire change” (Forces Children Scotland, 2023). In the workshop, I heard from young people from military families about their involvement in the charity’s various projects. I was particularly struck by how thoughtfully the charity had co-produced work with these young people to ensure their voices and experiences were embedded rather than an afterthought. It was clear from my conversations with these young people just how much they had benefited from their involvement, in terms of gaining confidence and new skills and having a continuous form of support in place as their lives continued to be shaped by the military.

To find out more about the SCiP Alliance, have a look at their website or follow them on X (formerly known as Twitter), @scipalliance.

Dr Jo Bjørkli Helgetun discusses teaching in the digital age in this blog about his research into the teacher professionalisation app Teacher Tapp.

Teacher Tapp is a smartphone app developed by the company Education Intelligence in June 2017, after originating as a research project funded by Nesta and the Gatsby Foundation. By the summer of 2020 the app had migrated to Arteveldehogeschool, a publicly funded higher education institution in Flanders, Belgium, where it is run under a licence from Education Intelligence. The app claims to be a voice for the teaching profession, a research tool, and a professional development device for teachers. It does so through daily surveys of teachers, whose results are available in the app the next day, as well as through the promotion of blogs in its “daily reads”. The app currently has close to 10,000 daily users in England and 2,000 daily users in Flanders, and its data has been cited in policy documents and parliamentary debates.

On the surface, the app is a neat package that functions the same way in England and Flanders and provides a form of professional voice-by-vote to its users. However, as we move beyond the app’s graphical interface, we start to see that the data streams take on multiple forms¹ with implications for our understanding of the teaching professions and the future of teachers’ professional lives in the digital age. These differences manifest in different spaces such as between England and Flanders, or between fora such as social media, blogs, policy documents, or parliamentary debates.

For example, there appears to be a great difference between England and Flanders with regard to the place of profit (and private entities in general) in education. In England, Education Intelligence sells a range of products, such as the possibility to ask questions in the app or to commission in-depth analysis based on data from the app. By contrast, this is not possible in Flanders, as it would be seen as unethical. Instead, all questions are asked by researchers at Artevelde (users can suggest questions), and the data is analysed only by them for their own use. This has led to something of a conundrum: even though Education Intelligence can be said to offer a pay-to-speak model, it at least provides some recourse for people to influence what is on the agenda in the Teacher Tapp app that is not present in Flanders. Indeed, much of the resistance towards the app among researchers in Flanders seems to centre on this lack of opportunity for others to ask questions through the app. Moreover, there has been much general scepticism about the validity of the approach, and even about the place of such direct democracy (as a type of teachers’ voice) in a society historically characterised by social dialogue between unions, the state, and school organisations in the so-called “pillared society” of Belgium. Such dialogue is completely absent in England, where teachers and researchers in education alike often feel they are shouting at a wall as their voices go unheard.

Meanwhile, the fit of Teacher Tapp within the existing “evidence”-based paradigm of education policymaking in England is evident. For example, data from Teacher Tapp has been referenced four times in parliamentary debates to either defend or attack government policy, where the source of the data seems arbitrary and the method of delivery follows well-established practices of reading short, numbers-centred sound bites from a prepared list of answers. Moreover, when cited in the white paper “Opportunity for all: Strong schools with great teachers for your child”, the discourse centred on too many primary teachers having to do their own planning (a failure of schools) and was used to set up new policy goals. The policy as presented in the paper did not come from a deeper study with much scientific rigour, and the question was a minor point in a Teacher Tapp blog focused primarily on phonics, but it is a classic example of policymakers shoehorning in numbers that fit the narrative. Unsurprisingly, the reported main problem with regard to planning – a lack of time – was ignored in the white paper.

Interestingly, the data as presented in the text of the cited Education Intelligence blog had also undergone a transformation in relation to the raw data (which was also presented), because the blog employed its own discourse. The blog’s results were themselves weighted, meaning they differed from the results shown in the app’s graphical interface. This illustrates how data can be presented in different forms as it moves from a mobile app into a blog and/or a policy document.
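The kind of transformation that weighting produces can be sketched with a toy example (the numbers and the phase split below are entirely hypothetical, not Teacher Tapp’s actual weighting scheme): reweighting responses to match the composition of the teacher population shifts the headline percentage away from the raw pooled readout.

```python
# Hypothetical illustration of survey weighting: the raw (pooled) result
# differs from the result reweighted to the population's composition.

# Raw responses, split by school phase: count answering "yes" and total n.
raw = {
    "primary": {"yes": 300, "n": 400},    # 75% yes, but over-represented
    "secondary": {"yes": 100, "n": 600},  # ~17% yes
}

# Unweighted headline: pool all responses together.
total_yes = sum(g["yes"] for g in raw.values())
total_n = sum(g["n"] for g in raw.values())
unweighted = 100 * total_yes / total_n  # 40.0%

# Weighted headline: match a (hypothetical) 50/50 national split of
# primary and secondary teachers.
population_share = {"primary": 0.5, "secondary": 0.5}
weighted = 100 * sum(
    population_share[phase] * g["yes"] / g["n"] for phase, g in raw.items()
)  # ~45.8%

print(f"unweighted: {unweighted:.1f}%  weighted: {weighted:.1f}%")
```

The same responses yield different headline figures depending on the weighting applied, which is precisely how a weighted figure in a blog can diverge from the unweighted readout in an app interface.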

These observations, and others², arguably reveal a range of things. Firstly, as new technologies are created and spread across borders in our ever-globalising world, they take on new forms and are received differently based on pre-existing local structures and cultures. Secondly, we note the many different logics regarding what is permissible in education, as well as how teachers are able to speak up, between such close places as Flanders and England. Thirdly, as voice becomes quantified and turned into “evidence”, it takes on a life of its own as the data streams flowing from an initial 3-5 questions are transformed to take on different meanings across contexts such as Twitter, policy documents, or political discussions.

As such, the notion of Teacher Tapp as a new form of voice for teachers raises important questions: what is a professional voice, how can one speak up in relation to one’s professional life, and does Teacher Tapp fit into the ongoing digital revolution in which data is increasingly fed into algorithms and forms of AI with unknown consequences? The computer knows (or pretends to know) more about us than we ourselves do. So, in the end, who does the speaking? The teacher who answers the survey? Education Intelligence or Arteveldehogeschool, who formulate the questions and conduct cross-analysis between a range of data sources? Someone else who obtains the data from the app, from Twitter, or by purchasing it from Education Intelligence, and then runs it through a form of AI-based analysis to determine what teachers actually want? What about the voice of the 1.1 million teachers who do not use an app like Teacher Tapp? We hope to be able to answer some of these questions through our study. Our first paper, titled “One thing can be more than one thing: A comparative study of the teacher professionalization app ‘TeacherTapp’”, is currently under review for publication.

 

Some links to further reading:

The website for Education Intelligence can be found at: www.teachertapp.co.uk

The website dedicated to Teacher Tapp at Arteveldehogeschool can be found at: https://sites.arteveldehogeschool.be/deleraandenkt/

For an overview of the company Education Intelligence see https://find-and-update.company-information.service.gov.uk/company/10825354/filing-history?page=2

 

¹ In more technical terms, these “forms” are what, in Science and Technology Studies parlance, are called multiple enactments.

²  Due to space limitations in this blog, many examples are left out. However, publications are under review that further demonstrate the points raised in this blog.

Co-authored by: Professor Leon Feinstein, Professor Geraldine MacDonald, Professor Paul Bywaters, Dr John Simmonds, Professor Karen Broadhurst, Professor Donald Forrester, Dez Holmes

A reflection on evidence and implementation

As members of the Evidence Group supporting the Independent Review of Children’s Social Care (IRCSC), a number of us have received requests to share our views on the evidence base underpinning the Review’s recommendations. In responding to these requests, our intention here is to offer a high-level and constructive perspective for those now tasked with thinking about implementation.

The first thing to clarify is that the Review is not a systematic review of all potentially relevant research evidence; it is a framework for policy and practice reform. Though informed by evidence, the recommendations are not all tightly linked to research evidence of intervention effectiveness – as might be the case when producing, say, NICE Guidelines. This is not a criticism. There are many types of review, and it is entirely usual for policy and/or practice reforms to draw on multiple sources of knowledge (for example, from research, practitioners, families and individuals) and for the evidence base to be incomplete and/or contested. As such, the Review drew on multiple types of knowledge and evidence. Those now focused on implementation will need to consider some of the complications this approach brings.

There is much to welcome in the Review, and many have called for urgent action to ensure reforms are not delayed. The need for improvement in services and positive change for children, young people and families is widely recognised and so there is an understandable drive to ‘do something’. Given the scale of reform proposed, there is an equally strong argument for thinking carefully about a number of the issues raised before progressing at pace. The Review was undertaken in a relatively short time-scale, working to a very broad scope and with an ambitious goal of system change. Implementation colleagues will need to recognise and grapple with the risks that result from ambiguous, conjectural or partial evidence. Taking time to interrogate the wider evidence base not reflected in the Review, to consider unintended consequences and manage interdependencies, would be time well spent. As with so many important decisions, one might approach in haste and repent at leisure.

For example, the structural reforms proposed in relation to Regional Care Cooperatives are an area where implementation colleagues will find very limited evidence to draw upon. The creation of Regional Adoption Agencies might be somewhat comparable, and the DfE-funded evaluation to date presents ‘a complicated picture’ with very mixed evidence of success against intended outcomes[1]. Given the Review’s intention to strengthen leadership and accountability, care will need to be taken that these structural reforms do not dilute local accountability mechanisms. With ever-increasing pressure on the care system, it is unclear whether the mechanisms proposed have the capacity to resolve the issues within the ‘market’, as it is often referred to. As with all structural change, and particularly in light of learning from NHS reorganisation, implementation of RCCs – if this idea is progressed – will need to ensure that this does not become an expensive distraction[2]. The proposed reforms to inspection will require similar attention; much of what is presented as unpopular or unhelpful within the Review can equally be seen as essential checks and balances, necessary within a system that exerts immense power over citizens’ lives.

The Review’s emphasis on family help is in the spirit of the 1989 Children Act and welcome to many who recognise that families in contact with children’s services too often describe a punitive approach to their difficulties[3]. As was explored in the first report from the review team, there is firm evidence of the socio-economic drivers which are associated with family involvement in child protection services[4]. Colleagues involved in implementation activity will be acutely aware that achieving a responsive and effective family help system depends less on restructuring children’s services and more on radical efforts by national government to reduce poverty, improve health, education and other services and reduce inequalities in living standards. At present, the foundational economy which is vital for family wellbeing is stretched beyond capacity. Moreover, restructuring alone, without fundamental consideration of the mission of children’s social care and changes in the power dynamic between families and services, is unlikely to bring the required change.

The proposed bringing together of Early Help with Child in Need and Child Protection is not wholly illogical; after all, support and protection are not neatly delineated. However, there are potential consequences that must be avoided: such a proposal could pull resources, expertise and the focus of attention away from family support; it could create confusion regarding existing legal thresholds and drive inconsistent practice with families. The proposals will also concern those who remember Munro’s commentary on the previous Information Commissioner’s query that “When looking for a needle in a haystack, is it necessary to keep building bigger haystacks?”[5]  Against a backdrop of concerns that professionals are missing children facing serious risk[6], could this proposal inadvertently exacerbate the situation? It might lead to an ever-widening investigative net, with decreasing resources available to do the kind of work required to develop trusting and purposeful relationships to support families. These are just some of the issues that implementation colleagues will need to grapple with.

The Review makes a number of recommendations regarding the workforce, and few would argue that skilled, knowledgeable practitioners are essential to a functioning system. The proposals to develop an Early Career Framework do not have a wealth of research evidence to draw upon, and there are potential risks in creating a separate system for early career child and family social workers and adult social workers. There are some insights from the evaluation[7] of the recently disbanded National Assessment and Accreditation System, which sought many of the same benefits as the ECF. In attending to training and practice guides, we must not overlook the wider evidence that training has limited impact on practice without accompanying efforts in relation to organisational context and climate[8]. There is limited evidence that issuing prescriptive guidance has a positive causal effect on practice quality (put simply, we wouldn’t need these reforms if guidance to date had been effective), and the significant influence of supervision, leadership and culture deserves equal attention.

In the current political context, there is a risk that the kind of long-term sustainable resource needed to achieve whole system change will not be forthcoming – and so implementation could become focused on what can be done with what resource is available. Without attention to the wider interdependencies, this risks fragmenting the system further, and could lead to some recommendations being progressed with limited effect (or worse, negative consequences). What is required is not temporary support or piecemeal funding of boutique initiatives, but long-term investment. Government must act as a whole system itself if it desires system change for children and families; this requires government departments to share ownership of complex and intersecting social issues and ensure the wider infrastructure which supports family life does not further decline.

Ultimately, evidence can only address so many issues. For the Review to achieve its intentions of improving the experiences and outcomes of children, young people and adults who encounter social care, it will be vital in our view that rigorous attention be paid to rights. Many of those with current or previous experience of social care services are among the most marginalised and simultaneously most scrutinised people in society; people whose voices and preferences have been overlooked for too long and for whom there has been a high degree of surveillance but not enough support. Proposed reforms, including those relating to the use of data, should be subject to assessment of their impact on equalities so that they do not inadvertently erode or undermine the rights of children and adults.

Lastly, colleagues focused on implementing the Review’s recommendations may be interested in recent research on the implementation of policies and practices within health systems, which identified that trusting relationships – those characterised by empathy, authenticity and collaboration – seem to be key to effective implementation[9]. This suggests that to successfully lead the proposed change, government must position itself as an enabler to the sector, exercising humility and a collaborative spirit. Policy reform, like good social work, requires more than passion for change. It requires critical thinking, skill and judicious use of evidence, and is something that can only be ‘done with’ and not ‘done to’ those it is seeking to influence.

 

[1] https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1057530/Evaluation_of_regional_adoption_agencies_-_final_report.pdf

[2] Walshe (2010) Reorganisation of the NHS in England. BMJ. 341:c3843

[3] See for example, Featherstone, B., Gupta, A., Morris, K. & White, S. (2018) Protecting Children: A Social Model. Bristol: Policy Press.

[4] See for example, Bywaters, P. and Skinner, G. (2022). The Relationship Between Poverty and Child Abuse and Neglect: New Evidence. Nuffield Foundation.

[5] Information Commissioner (2005) Evidence Given to Select Committee for Education and Skills, House of Commons, London.

[6] Child Safeguarding Practice Review Panel (2022). Child Protection in England. HM Government. Available: https://www.gov.uk/government/publications/national-review-into-the-murders-of-arthur-labinjo-hughes-and-star-hobson

[7] https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/938083/NAAS_delivery_evaluation_of_phases_1_and_2.pdf

[8] Burke, L. A., & Hutchins, H. M. (2007). Training transfer: An integrative literature review. Human resource development review, 6(3).

[9] Metz, A., et al. (2022) ‘Building trusting relationships to support implementation: A proposed theoretical model’ Frontiers in Health Services. Vol 2.  https://www.frontiersin.org/articles/10.3389/frhs.2022.894599

 

Alumni of the Masters in Educational Assessment, Lorena Garelli and Kevin Mason, presenting their dissertation research.

Trinity’s School of Education and the Educational Research Centre, Drumcondra, hosted AEA-Europe’s Annual Conference on 9-12 November in Dublin, Ireland.

Over 350 attendees from 37 countries reflected on the conference’s theme, “New Visions for Assessment in Uncertain Times.” This diverse range of attendees included over 15 people affiliated with OUCEA. Throughout the conference, attendees explored possible directions for assessment policy and practice in schools, higher education, and vocational/workplace settings over the coming years. Much of the reflection centred on the instability of the recent past: the pandemic, the war in Ukraine, and global economic challenges have created a sense of uncertainty in all spheres of life. As a result, attendees took stock and reimagined assessment in a world where the certainties of past decades have given way to a more uncertain environment.

Keynote speeches addressed topics as diverse as “Assessing learning in schools – Reflections on lessons and challenges in the Irish context” and “Assessment research: listening to students, looking at consequences.”

In addition to the keynotes, the conference hosted panel and poster presentation opportunities. Many members and associates of the OUCEA shared their research. For example:

Honorary Norham Fellow

  • Lena Gray – presented on assessment, policymakers, and communicative spaces – striving for impact at the research–policy interface

Honorary Research Associate

  • Yasmine El Masri – an OUCEA Research Associate – presented on Evaluating sources of differential item functioning in high-stakes assessments in England

Researcher

  • Samantha-Kaye Johnston – an OUCEA Research Officer – presented on Assessing creativity among primary school children through the snapshot method – an innovative approach in times of uncertainty.

Current doctoral students

  • Louise Badham – a current DPhil student – presented on Exploring standards across assessments in different languages using comparative judgment
  • Zhanxin Hao – presented on The effects of using testing and restudy as test preparation strategies on educational tests
  • Jane Ho – presented on Validation of large-scale high-stakes tests for college admissions decisions

MSc in Educational Assessment graduates and students

  • Kevin Mason – presented on Assessment of Art and Design Courses using Comparative Judgment in Mexico and England
  • Lorena Garelli – presented on Assessment of Art and Design Courses using Comparative Judgment in Mexico and England
  • Joanne Malone – presented on Irish primary school teachers’ mindset and approaches to classroom assessment
  • Merlin Walters – presented on The comparability of grading standards in technical qualifications in England: how can we facilitate it in a post-pandemic world?

As the breadth of topics shows, OUCEA is engaged in wide-ranging research. The team looks forward to presenting more of our work at AEA-Europe’s 2023 conference in Malta.

Dr Neil Harrison, Deputy Director

For some years now, prospective students applying through the UCAS system have been given the option of declaring whether or not they are care-experienced.  Aside from helping statisticians, this self-identification information is passed confidentially to their university when they join to help them to target additional support such as bursaries, accommodation, academic help and mental health interventions.

There has been concern about how effective this system is.  For example, we know informally that some care-experienced students are reluctant to tick the box as they are worried about stigma or that it will negatively impact on their university application.  Some applicants may not realise that they were in care if they were young or if it meant living with relatives in a kinship care arrangement.  Furthermore, not all students enter higher education through the UCAS system.

Anecdotally, there are also some people who tick the box when they are not care-experienced.  These applicants may not understand the question – perhaps think it’s about caring for other people – or tick it by accident.

 

False positives and false negatives

There are thus two issues.  The first is the ‘false positives’: those who say they are care-experienced when they are not; these create a bit of extra work (to do the checking) and are a potential source of error in statistics.  However, the ‘false negatives’ are more concerning.  These are students who should be entitled to additional support from their university, but who are not getting it because their university doesn’t know they are care-experienced.  It is obviously useful for policy and practice to know how many false positives and negatives there are.
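Once self-identification flags are linked to administrative care records, the two groups can be counted directly. A minimal sketch with invented records (the field names are illustrative, not those of the actual linked dataset):

```python
# Count 'false positives' and 'false negatives' by comparing the linked
# care history (ground truth) with the university's self-identification flag.
records = [
    {"in_care": True,  "self_identified": True},   # correctly identified
    {"in_care": True,  "self_identified": False},  # false negative: missed support
    {"in_care": False, "self_identified": True},   # false positive: extra checking
    {"in_care": False, "self_identified": False},  # correctly not flagged
    {"in_care": True,  "self_identified": True},   # correctly identified
]

# Booleans sum as 0/1, so each generator below counts matching records.
false_negatives = sum(r["in_care"] and not r["self_identified"] for r in records)
false_positives = sum(r["self_identified"] and not r["in_care"] for r in records)

print(f"false negatives: {false_negatives}, false positives: {false_positives}")
```

The same cross-tabulation, applied to the real linked data, is what produces the table discussed below.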

The data that we’ve assembled for one of our projects has enabled us to shine a partial light on the self-identification data.  It doesn’t completely answer the questions as there are significant gaps in the data we have – we will touch on these later.  However, it does give us some useful clues for the first time which we thought it would be useful to share informally.

 

Exploring the data

We have anonymised data for England relating to the cohort of people born in the 1995/96 school year who remained in England between the ages of 11 and 18 – about half a million in total.  We have been able to link data over time to combine care histories from the age of 8 (when the national data begins) with higher education records up to the age of 21.  Therefore, we know (a) whether the student’s university believes them to be care-experienced based on self-identification, and (b) whether they had indeed been in care.

To complicate matters, the university can allocate the student to one of two care-experienced categories.  The definitions for these are very unclear, but we believe they are broadly intended to represent care leavers (meeting the statutory definition) and other care-experienced students.

 

Table

 

The table above summarises what we have found, based on the data that were held at the end of the student’s first year.  There isn’t space here to cover everything, but some basic observations:

  1. It’s clear that universities are not collectively using the two care-experienced markers appropriately, with nearly half of care leavers recorded in the ‘wrong’ category. The national data is therefore poor at differentiating between statutory care leavers and other care-experienced students.
  2. However, about 85% of statutory care leavers are being appropriately classified as care-experienced through self-identification. The other 15% are split between those stating that they are not care-experienced (i.e. false negatives) and those for whom the data are missing (perhaps due to refusal).
  3. The system is also reasonably good at identifying other students who were in care after the age of 14, with 75% self-identifying, although 17% stated that they were not care-experienced and 32% were wrongly classified as statutory care leavers.
  4. However, students who were in care between the ages of 8 and 14 were much less likely to self-identify as care-experienced – only 28% did so, with over half explicitly saying that they were not care-experienced.
  5. The ‘children in need’ group are not care-experienced (having been allocated a social worker, but not entered care), but there was a small proportion (3%) who had self-identified as such (i.e. false positives).
  6. The same was true for the general population. The proportion was very small, but this represented over 500 individuals.  Some of these are undoubtedly false positives, but others may have been in (and left) care before the age of 8, including those adopted from care.
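The false positives and false negatives discussed above fall out of a simple cross-tabulation of true care history against self-identified status. The sketch below shows the idea with made-up records; the category labels are hypothetical stand-ins, not the dataset's actual codes.

```python
from collections import Counter

# Hypothetical linked records: (care history from administrative data,
# self-identification recorded by the university). Illustrative only.
records = [
    ("care_leaver", "care_experienced"),
    ("care_leaver", "not_care_experienced"),     # a false negative
    ("care_leaver", "care_experienced"),
    ("general_population", "care_experienced"),  # a false positive
    ("general_population", "not_care_experienced"),
]

counts = Counter(records)  # cross-tabulation as (truth, stated) -> count

# False negatives: genuinely care-experienced, but recorded as not.
false_negatives = sum(
    n for (truth, stated), n in counts.items()
    if truth == "care_leaver" and stated == "not_care_experienced"
)
# False positives: not care-experienced, but recorded as such.
false_positives = sum(
    n for (truth, stated), n in counts.items()
    if truth == "general_population" and stated == "care_experienced"
)
print(false_negatives, false_positives)  # 1 1
```

In the real analysis the same tabulation is done over half a million linked records and several care-history categories, but the logic is no more complicated than this.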

 

Implications for policy and practice

This small piece of analysis is not intended to be the final word and it is limited in some important ways.  For example, we only have higher education data up to 2016/17 and the situation has almost certainly improved somewhat since then, with markedly more attention on care-experienced students over the last five years.  We also only have data on younger students aged between 18 and 21, so the situation may be different for those entering higher education at a later stage.  However, there are some useful lessons from the data:

  • Firstly, the way in which data is being recorded by universities varies widely and this is likely to be leading to confusion, both in the provision of support and in understanding who is entering higher education.  I am aware that the Office for Students is currently seeking to address this with the Higher Education Statistics Agency, UCAS and universities, which is a very positive step.
  • Secondly, there is clearly some degree of incorrect self-identification – this is likely to be mainly accidental and probably reflects misunderstanding about what constitutes ‘care’ in this context. Nevertheless, this does mean that the self-identification data cannot be taken at face value and does need to be subject to confirmation by universities, creating a small administrative burden to ensure that support is correctly directed at those entitled to it.  This requires universities to have a good understanding of care and a mechanism to enable students to evidence their status as sensitively as possible.
  • Thirdly, a sizable proportion of care-experienced students of various categories are being missed by the self-identification system, especially among those who left care prior to their teenage years.  This suggests that there is much more work to be done to ensure that care-experienced students are aware of the benefits of self-identifying and feel able to do so without stigma.  Clearly, however, they must always have the right to not share this information about themselves if they prefer – or to do so at a later date.

A positive development in recent years is that many universities have broadened their support, extending beyond statutory care leavers and removing age thresholds.  This is to be welcomed as it is not just younger care leavers who experience educational disruption and who can benefit from additional help to enter, and thrive within, higher education.  These data suggest, however, that there is still work to be done to reach all those who are entitled to receive it.

Leon Feinstein

I am delighted to chair the evidence group for Josh MacAlister’s review of the care system, described by the Secretary of State who launched it as a “wide-ranging, independent review to address poor outcomes for children in care as well as strengthening families to improve vulnerable children’s lives.”

Josh MacAlister and the review team published their opening position on Thursday 17th June, a statement on the case for change.

The review has heavy billing, not least as the level of government borrowing is higher now than when a previous Conservative chancellor demanded austerity. The current administration may be more inclined to spend but will they spend on children outside of universal services?

The review is also in the shadow of the recent review of the system in Scotland, which was much longer, was evidently led by those with experience of care, and reported into a devolved administration that has a clear articulation of its commitment to deliver the rights of children.

The pressure is certainly on. The case for change sets out Josh’s interpretation and that of the review team of what they have heard so far, in listening to and reading the evidence of personal testimony, academic research, expert views and from responses submitted to the review. From this, the case for change indicates where the review team think the system needs to change.

It is very welcome that the review is publishing the case for change so that everyone with an interest knows where the review will focus and is able to respond on the more specific issues. I know that Josh and the team are very open to all responses on these questions and I know that they are listening. People can respond here.

There are two old clichés about how those outside positions of power in government might best engage in the business of government. One cliché is the lift test: “what do you do if you find yourself in a lift with a senior figure?” How to cut through, what to say, how to be heard? The second cliché concerns a train leaving the station. You don’t get to decide when the train runs; your choice is whether to ride the train.

The Evidence Group is one of three groups providing support to the review. Of particular importance is the Experts by Experience Board, there to ensure a voice in the review for those who have had a social worker (either themselves or a child in their care). The Design Group will help guide how the review designs its recommendations.

In my work I have tried to bring good evidence to bear on policy and practice and help ensure it is used meaningfully and accurately. To do this we need a clear idea about what we mean by good evidence: what counts as evidence, for whom, about what, applied how? We might call this an episteme, a framework of agreement about what counts as knowledge and how it should be interpreted, which allows us to settle on truths, or at least determine what we mean by truthfulness, in how we answer research questions intended to inform decisions.

As chair of the evidence group, I can say that the review team have had access to a great deal of high quality evidence of multiple sorts on multiple questions. In the time available Josh and the team have made their interpretation of what it says about what should change in the care system, focusing particularly on the side of problems and issues requiring attention, rather than the many daily successes and positive outcomes that make up so much of existing practice and experience.

The members of the evidence group appointed by the review have submitted their views on the review team’s reading of the evidence and on the team’s interpretation and representations of it. Ultimately the case for change is not primarily an evidentiary paper in the sense of being set up as a research or science project with a clear technical methodology to address a narrow scientific or social scientific question. It isn’t subject to formal peer review and approval in the way that a National Statistic or an academic paper might be. Neither I nor the other group members get to sign off the document. It is ultimately the view of Josh and his team and that is in part what is meant by an independent review. Another reviewer might have looked at the evidence differently and made a different case or called for different changes.

I hope it leads to a fruitful discussion. For what it is worth I think the field suffers from a lack of agreement about what counts as good evidence. Because of the nature of the evidence as yet available and the diversity of views on it, many of the issues in the case for change are subject to considerable uncertainty and disagreement so it is likely that debate will continue.

I don’t think the question of the appropriate balance between statutory care of children and wider support to families is resolved by the evidence available, nor do we know enough in aggregate about what structures best help people provide the right supports to which groups of children at the right times. I agree it is good to have a debate about these things. The available evidence can inform and there will be more evidence gathering in the next stages of the review.

I hope that the review goes on to make valuable and effective recommendations that address many of the issues and challenges raised in evidence to the review and that these lead to real improvements to the experiences of children, families and care-experienced people. I hope that the review is able to address the clear call from those with care experience to be heard, not just in the review, but in perpetuity. Finally, I hope the review addresses the need to improve knowledge and understanding, both about how the care system might be improved and in helping the public, and hence government, recognise the work of and hear the voices of care-experienced people, children, social workers, carers, directors of children’s services and others who are too often drowned out of the public debate.

We will all have differing views on all of this. I hope we will have more blogs in the weeks ahead.

Read more about the case here.

Knowledge as the Anchor

In a recent conversation with one of our Derbyshire Attachment Aware Schools (AAS) they said: ‘AAS has been our anchor during this COVID pandemic storm’. They discussed with me how the knowledge gained through the Attachment Aware Schools programme about attachment, trauma, brain development, and reactions to our world and life events had really benefited their school. They stated how potent being an AAS school has been, in terms of their understanding and ability to deal with the challenges currently being faced in education. The AAS knowledge has given them a strong foundation to build their recovery curriculum upon, and they report how stabilising this has been at a time of such unprecedented complications to the life and routines in school and the world at large.

This school and many other AAS schools in Derbyshire have reported feeling more resilient and confident to support the children and staff in their setting to make the ‘best’ of a really difficult time, personally and professionally.

“We felt as well prepared as we could be when the children returned after lockdown because of being on the Derbyshire AAS programme. We knew we would see challenging behaviour, and other changes, and felt we had a secure knowledge base with which to address any issues as they arose.”

The Derbyshire Attachment Aware Schools Programme

AAS is designed to work with schools and settings to explore human development and behaviour, and how this affects learning. The programme fills an identified gap in human development and relational practice that many teachers and school staff express they did not experience in their initial professional training. AAS enables schools to re-examine their practice, policies and systems to develop a whole school ethos where relationships are truly understood to be the cornerstone of learning.

“Being part of the AAS programme and schools’ network has given us the confidence to look at our whole school’s provision – prior to the pandemic, and now moving out of lockdown – and to think carefully about what we want to retain/reinstate in the future. We can look at all of this through a trauma-informed perspective and think about what’s really best for our school community.”

So what is an Attachment Aware School?

The Attachment Aware Schools programme, offered to schools in Derbyshire, is a whole-school learning and development programme. Using attachment theory and neuroscientific knowledge as an underpinning theoretical framework, we explore behaviour and the impact that poor early life and traumatic experiences can have on human growth, learning and development. Schools on the programme deepen their understanding of human behaviour and relationships through a year-long series of taught inputs and supported action research projects. The learning journey is designed to help schools focus on the unique set of circumstances that constitute their school community and how best to address the needs and challenges that will inevitably arise in an intergenerational working community.

“Our increased understanding of attachment needs has influenced policy, systems and most importantly support for our students at every level. The whole ethos of school has changed. We now have the resilience to take risks and support each other to meet the challenges of our most vulnerable students.”

Ethos, Mindset and the Golden Thread

Our AAS programme is designed to develop mindset, ethos and practice in schools and education settings. It is not a toolkit of prescribed interventions or practices; as helpful as resources can be, they don’t always have the sustainable impact that we know schools want and need to truly embed and maintain new and more effective ways of working. Our mission is to help bring about a renewed understanding and approach to behaviour to maximise the potential and outcomes of children, young people and, in fact, all those who work and learn together in education.

The ‘golden thread’ that holds all of our ‘graduate’ schools together in our AAS network across the county is an understanding of the importance of building and maintaining good relationships: young person to adult, young person to young person, and adult to adult. We place this understanding of the impact of human dynamics at the heart of school ethos and practice to build a safe and nurturing learning environment where all learners, and their educators, can flourish.

What has been the impact to date?

We have seen improvements in:

  • Relationships in school
  • School experience for pupils
  • Levels of anxiety, stress and worry
  • Effectiveness of policies and communication systems
  • Staff attitudes to work
  • Student behaviour – lower levels of incidents and disruption
  • Academic progress and attainment
  • Attendance

…and best of all – better relationships and a deeper understanding of the needs of children, young people and colleagues, to ensure the best experience and outcomes in every school day.

Lizzie Watt
Attachment Aware Programme Lead
lizzie.watt@derbyshire.gov.uk

Image attribution: Anchor created by freepik – www.freepik.com

Dr Neil Harrison, Deputy Director of the Rees Centre

The team in the Department for Education (DfE) that produces statistics on progression to higher education have really upped their game recently.  Starting with a trial last December, they are now publishing an annual digest of statistics looking at a wide range of demographic and educational groups, helpfully including a backwards time series.  The latest of these digests was published a couple of weeks ago and covers the 2018/19 academic year.  Importantly, these statistics are based on linking – at the individual level –  the data collected by universities with that collected by schools and colleges, providing a rich lens to understand inequalities in the system.

Interestingly, one of the groups explored is care leavers.  I have written before about issues with the statistics produced from the data collected by local authorities (the so-called ‘SSDA903’ data) and the new DfE digest represents a significant step change as it reflects definitive records about who has gone on to higher education, including in further education colleges and private providers.

It’s also important to note that the definition of ‘care leaver’ used is slightly quirky, in that it is not the statutory one.  The definition used for analysis is those children in care continuously for the 12 months up to 31st March in the academic year when they turned 16 (i.e. Year 11 for the vast majority).  In other words, the definition captures only those with a good degree of stability, although they may have changed placements in this time.  It effectively excludes most of those entering care at 14 or 15.

What do the new statistics say?

The statistics in the digest reflect progression to higher education by the age of 19 – i.e. allowing for one ‘gap’ year after school/college.  There are issues with this that I will return to shortly.  The data focus on English young people, but include (most) higher education elsewhere in the UK.  For the purposes of this blog post, I’ve brought together several of the groups covered by the digest into the time series chart below:

We look first at the blue line representing care leavers.  The progression rate for 2018/19 was 13%.  This is more than double the oft-(mis)quoted 6% figure that comes from the SSDA903 dataset and I am confident this is a much more realistic reflection of the situation.  There has been a pretty steady rise from 9% in 2009/10, with a couple of one-year blips, which is also good news.  This fits well with what universities say – I hear many reports of a year-on-year growth in care leavers and other care-experienced students.

However, the yellow line shows the situation for young people who are not care leavers and this starkly demonstrates a persistent inequality – the progression rate for this group was 43% in 2018/19.  If anything, the gap between the blue and yellow lines has widened slightly over the ten years of the time series, from 25 percentage points in 2009/10 to 30 percentage points in 2018/19.  This is worrying, as it suggests that care leavers have not been able to expand their ‘share’ of higher education at the same rate as other young people.
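The percentage-point gaps quoted above follow directly from the rates in the digest. A quick check (the 2009/10 rate for other young people is implied by the 25-point gap and the 9% care leaver rate, rather than quoted directly):

```python
# Progression rates (%) quoted or implied in the post:
# care leavers vs young people who are not care leavers.
rates = {
    "2009/10": {"care_leavers": 9, "others": 34},   # 34 implied by the 25pp gap
    "2018/19": {"care_leavers": 13, "others": 43},
}

# Gap in percentage points for each year of the time series.
gaps = {year: r["others"] - r["care_leavers"] for year, r in rates.items()}
print(gaps)  # {'2009/10': 25, '2018/19': 30}
```

Both groups' rates rose over the decade, but the absolute gap still widened by five percentage points.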

As I discussed in my 2017 ‘Moving On Up’ report, it is important to remember that there are strong explanatory factors at work and when you compare care leavers with similar demographic and educational profiles, much of this difference disappears.  For example, care leavers are significantly more likely to have special educational needs which impact on their attainment and therefore on their ability to pursue higher education – at least in the short term.  We will almost certainly never be in a position to eliminate the gap, but we should collectively be aiming for these lines to converge over time.

How do care leavers compare to other disadvantaged groups?

The green line represents young people who were eligible for free school meals when they were in Year 11.  There are, once again, issues with this definition and what it means, but this is a useful broad proxy for children who grew up in economically disadvantaged households.  The 2018/19 progression rate for this group was 26% and therefore double that of care leavers.  Again there has been a widening of the gap across the time series, from 10 percentage points to 13 percentage points.

Finally, the red line – for which only four data points are available – represents children designated as being ‘in need’ on 31st March in the academic year when they turned 16.  Interestingly, the higher education progression rate for this group is actually slightly lower than for the care leaver group – e.g. 11% in 2018/19.

This is consistent with other analysis, including the Rees Centre’s recent report (with the University of Bristol) looking at educational outcomes for children in need.  More research is needed to understand this fully, but it suggests that long-term and stable care placements often – if not always – support progression to higher education, relative to other young people experiencing profound challenges within their birth family.

Why is looking at progression at age 19 an issue?

All quantitative analysis of social data is driven by definitional issues.  These are rarely neutral or objective – you have to decide what groupings to use, how you determine the boundaries and so on.  As discussed, the new DfE digests use a particular definition of a care leaver – if they used a different definition, the analysis would yield different results.

One decision is about time cut-offs.  This is always tricky.  The longer timeframe you look at, the less reliable the historic data become – if they exist at all.  The DfE’s cut-off at the age of 19 is a longstanding one and makes sense for the general population who most commonly progress immediately after school/college or after a gap year.

However, as I’ve shown elsewhere, this does not hold for care-experienced students.  The social and educational disruption they undergo as a result of their care journeys means that they are often not qualified or ready to pursue higher education at 18 or 19.  In fact, most who do go to university do so in their 20s or even later in life.  We don’t yet know for sure, but it is likely that something like 25-30% of care-experienced people will undertake higher education at some point in their life.

This is still not high enough, but the DfE digest – useful as it is – can only ever be part of the story and the blue and yellow lines would be closer if a longer timeframe were used.

A final note…

It is always important to remember that progression into higher education is only one side of the coin and that there is good evidence that care leavers and other care-experienced students are at greater risk of leaving higher education early.  It would be great to see some official figures from the DfE on this at some point, to help us to understand the scale of the problem.

Contact Neil: neil.harrison@education.ox.ac.uk