Department of Education

Academic Staff

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. She received her Master of Arts in Education in England and then completed her PhD in Psychology in Australia. Her expertise and interests lie at the intersection of education and psychology, approached through a cognitive psychology lens. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has more than ten years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie Scholarship, which enables Jamaican students with reading difficulties to attend university. As an extension of this project, she founded Reading for Humanity to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as with non-governmental, private, and community organisations and United Nations bodies.

She has experience as a University Associate at Curtin University and a Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. In these roles, she has taught and/or assessed various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that gave primary and secondary school students the opportunity to contribute their views on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including by elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasising that while we cannot always build the future for them, we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor’s degree in Mathematics and a master’s degree in Education from The University of Western Australia, and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger Prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent six months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch, and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990 he was elected Fellow of the Academy of the Social Sciences in Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, in particular Rasch models for measurement, on topics ranging from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in educational, psychological, sociological and statistical journals. He is the author of Rasch Models for Measurement (Sage) and co-author of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional one in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in the paradigm of Rasch models, the case for these models is that if the data fit the model then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then seen as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand involves further articulating the implications of Rasch models and developing complementary software to better understand a range of anomalies, for example how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat multiple tests, for example for selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
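To make the invariance property concrete, a minimal illustrative sketch of the dichotomous Rasch model follows; the notation (β for person location, δ for item difficulty) is chosen here for illustration and is not taken from the text above.

\[
P(X_{ni} = 1 \mid \beta_n, \delta_i) \;=\; \frac{e^{\beta_n - \delta_i}}{1 + e^{\beta_n - \delta_i}}
\]

Because the total score is a sufficient statistic for the person parameter, conditioning on it removes that parameter. For two items \(i\) and \(j\), given that a person answered exactly one of them correctly,

\[
P\!\left(X_{ni} = 1 \mid X_{ni} + X_{nj} = 1\right) \;=\; \frac{e^{-\delta_i}}{e^{-\delta_i} + e^{-\delta_j}},
\]

which does not depend on \(\beta_n\). This is the sense in which comparisons of items are invariant with respect to the persons used (and, symmetrically, comparisons of persons are invariant with respect to the items used), and it underlies the conditional estimation referred to in the paper cited above.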

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment, and a Fellow of Kellogg College. She is the Course Director of the Master’s in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe, and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020–2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB), on critical thinking in PYP schools internationally and on the evaluation of education reforms in Kent, UK (2020–2021). In 2020, she led the research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018–2023). She was the Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016–2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission (Ireland), the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE, the Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010–2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual’s Research Advisory Board in the UK (2021–2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014–2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the Emirates, and has carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London, and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance (AQA), where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she has an MBA. Her current research projects include setting and maintaining standards in national examinations; examination reform, including the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual’s Standing Advisory Group, and as a member of the Welsh Government’s Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is the Course Director of the Master’s in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate, on critical thinking in PYP schools internationally and on the evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission of Ireland, the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE, the Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual’s Research Advisory Board in the UK (2021 – 2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014 – 2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the Emirates, and has carried out policy work for UNESCO, the OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and has also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has given him extensive experience in academic and government contexts in education, including psychometric analysis and innovation for state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA. Her current research projects include Setting and Maintaining Standards in national examinations; Examination reform: the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual’s Standing Advisory Group and as a member of the Welsh Government’s Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Research

David Andrich's current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional paradigm in which statistical models are applied. In the traditional paradigm, the case for choosing a model to summarise data is that it fits the data at hand; in the Rasch paradigm, the case for the model is that if the data fit the model then, within a frame of reference, they provide invariant comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then treated as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand further articulates the implications of Rasch models and develops complementary software to better understand a range of anomalies, for example how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and how to deal with multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model when each person has sat multiple tests, for example in selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
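As a brief illustration of the invariance property referred to above (a sketch added for clarity, using standard Rasch notation, rather than anything drawn from Andrich's publications), consider the dichotomous Rasch model:

\[
\Pr(X_{ni}=1) = \frac{\exp(\beta_n - \delta_i)}{1 + \exp(\beta_n - \delta_i)},
\]

where \(\beta_n\) is the location (ability) of person \(n\) and \(\delta_i\) the location (difficulty) of item \(i\). If two persons \(n\) and \(m\) attempt the same item and, given their responses are independent, exactly one of them succeeds, then

\[
\Pr(X_{ni}=1 \mid X_{ni} + X_{mi} = 1) = \frac{\exp(\beta_n)}{\exp(\beta_n) + \exp(\beta_m)},
\]

which does not involve \(\delta_i\): the comparison of the two persons is the same whichever item is used, and the analogous conditioning on item totals eliminates the person parameters from comparisons of items. It is this property, rather than fit to a particular data set, that supplies the case for the Rasch model in the paradigm described above; misfit then signals that the invariance does not hold in the data and calls for a substantive explanation.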

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is Course Director of the Master's in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck's research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB), on critical thinking in PYP schools internationally and on the evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission (Ireland), the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE, the Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo's research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and Fellow of Kellogg College. She is the Course Director of the Master’s in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB): on critical thinking in PYP schools internationally, and on the evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Research Council of Norway, the Education Endowment Foundation, the State Examinations Commission (Ireland), the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, Chair of Ofqual’s Research Advisory Board in the UK (2021 – 2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014 – 2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the United Arab Emirates, and has carried out policy work for UNESCO, the OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne Baird held academic posts at the Institute of Education, University of London, and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance (AQA), where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she has an MBA. Her current research projects include Setting and Maintaining Standards in national examinations; Examination reform: the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual’s Standing Advisory Group, and as a member of the Welsh Government’s Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Research

David Andrich’s current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional one in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in the paradigm of Rasch models, the case for these models is that if the data fit the model then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then seen as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand further articulates the implications of Rasch models and develops complementary software to better understand a range of anomalies: for example, how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and how to deal with multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model when each person has sat multiple tests, for example for selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is the Course Director of the Master's in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB), on critical thinking in PYP schools internationally and on the evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission Ireland, the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual's Research Advisory Board in the UK (2021 – 2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014 – 2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the Emirates, and has carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based assessment

· Implementation and evaluation of assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance (AQA), where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she has an MBA. Her current research projects include setting and maintaining standards in national examinations; examination reform, including the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual's Standing Advisory Group and as a member of the Welsh Government's Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen's (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.


Research

David Andrich’s current research on applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional one in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in the Rasch paradigm, the case for these models is that if the data fit the model then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then treated as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand further articulates the implications of Rasch models and develops complementary software, in order to better understand a range of anomalies: for example, how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and how to deal with multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model when each person has sat multiple tests, for example in selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
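
As a minimal sketch of the property referred to above (not drawn from the cited paper, and written for the simpler dichotomous case rather than the polytomous model that paper treats), the LaTeX fragment below states the Rasch model and shows how the item parameter cancels when two persons are compared on the same item, which is the sense in which comparisons are invariant and the total score is sufficient.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch only: the dichotomous Rasch model and its invariance property.
% Symbols: \beta_n is the location of person n, \delta_i the difficulty of item i.
The model gives the probability of a correct response as
\begin{equation}
  \Pr\{X_{ni}=1\} \;=\; \frac{\exp(\beta_n-\delta_i)}{1+\exp(\beta_n-\delta_i)}.
\end{equation}
% Comparing persons n and m on the same item, given exactly one correct
% response between them, \delta_i cancels:
\begin{equation}
  \Pr\{X_{ni}=1,\, X_{mi}=0 \mid X_{ni}+X_{mi}=1\}
  \;=\; \frac{\exp(\beta_n)}{\exp(\beta_n)+\exp(\beta_m)},
\end{equation}
so the comparison of the two persons does not depend on which item was used;
symmetrically, comparisons of items do not depend on which persons responded.
The same algebra makes the total score $r_n=\sum_i x_{ni}$ sufficient for
$\beta_n$, which underpins the conditional estimation mentioned above.
\end{document}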

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is the Course Director of the Master's in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB), on critical thinking in PYP schools internationally and on the evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by the Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission (Ireland), the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual's Research Advisory Board in the UK (2021 – 2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014 – 2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the Emirates, and has carried out policy work for UNESCO, the OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and has also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has given him extensive experience in academic and government contexts in education, including the psychometric analysis and innovation of state- and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London, and at the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance (AQA), where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she has an MBA. Her current research projects include Setting and Maintaining Standards in national examinations; Examination reform: the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual’s Standing Advisory Group and as a member of the Welsh Government’s Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Research

David Andrich’s current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional one in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in the paradigm of Rasch models, the case for these models is that if the data fit the model then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then treated as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand involves further articulating the implications of Rasch models, and developing complementary software, to better understand a range of anomalies: for example, how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and how to deal with multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model when each person has sat several different tests, for example in selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
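
As a brief illustration of the invariance property described above (a standard result for the dichotomous Rasch model, sketched here for orientation rather than taken from the cited paper): the model specifies the probability of a correct response in terms of a person location and an item difficulty, and conditioning on a person’s total score over two items eliminates the person parameter entirely, so the comparison of the items does not depend on which persons happened to respond; symmetrically, comparisons of persons can be freed of the item parameters.

\[
\Pr\{X_{ni}=1\} \;=\; \frac{\exp(\beta_n - \delta_i)}{1 + \exp(\beta_n - \delta_i)},
\qquad
\Pr\{X_{ni}=1 \mid X_{ni} + X_{nj} = 1\} \;=\; \frac{\exp(\delta_j - \delta_i)}{1 + \exp(\delta_j - \delta_i)},
\]

where \(\beta_n\) is the location of person \(n\) and \(\delta_i, \delta_j\) are the difficulties of items \(i\) and \(j\). The person parameter cancels from the conditional probability, which is why the total score is a sufficient statistic for \(\beta_n\) and why conditional estimation of the kind referred to in the citation is possible.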

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is Course Director of the Master's in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe, and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the IB, on critical thinking in PYP schools internationally and an evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission Ireland, the Jacob Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual’s Research Advisory Board in the UK (2021 – 2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014 – 2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the Emirates, and has carried out policy work for UNESCO, the OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she has an MBA. Her current research projects include: Setting and Maintaining Standards in national examinations; Examination reform: the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual’s Standing Advisory Group, and as a member of the Welsh Government’s Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Research

David Andrich’s current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional one in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in the Rasch paradigm, the case for these models is that if the data fit the model then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then treated as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes. The second strand involves further articulating the implications of Rasch models, and developing complementary software, to better understand a range of anomalies: for example, how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and how to deal with multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model when each person has sat multiple tests, for example for selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
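
As an illustration of the invariance property described above (a minimal sketch in standard notation, not taken from the cited paper), consider the dichotomous Rasch model, in which the probability that person $n$ with location $\beta_n$ answers item $i$ with difficulty $\delta_i$ correctly is

$$\Pr\{X_{ni}=1\} = \frac{\exp(\beta_n-\delta_i)}{1+\exp(\beta_n-\delta_i)}.$$

If persons $n$ and $m$ attempt the same item and exactly one of them succeeds, the probability that the success belongs to person $n$ is

$$\Pr\{X_{ni}=1 \mid X_{ni}+X_{mi}=1\} = \frac{\exp(\beta_n-\delta_i)}{\exp(\beta_n-\delta_i)+\exp(\beta_m-\delta_i)} = \frac{\exp(\beta_n)}{\exp(\beta_n)+\exp(\beta_m)},$$

which does not involve $\delta_i$: the comparison of the two persons is the same whichever item is used, and the symmetric argument yields comparisons of items that are free of the person parameters. It is this property, rather than fit to the data at hand, that motivates the choice of the model.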

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is the Course Director of the Master's in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB): one on critical thinking in PYP schools internationally, and an evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission (Ireland), the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement, within the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual’s Research Advisory Board in the UK (2021 – 2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014 – 2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the United Arab Emirates, and has carried out policy work for UNESCO, the OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and has also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has given him extensive experience in academic and government contexts in education, including psychometric analysis and innovation in state- and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment, including questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance (AQA), where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she has an MBA. Her current research projects include Setting and Maintaining Standards in national examinations; Examination reform: the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual’s Standing Advisory Group, and as a member of the Welsh Government’s Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

David Andrich: Research

David Andrich's current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional one in which statistical models are applied. In the traditional paradigm, the case for choosing a model to summarise data is that it fits the data at hand; in contrast, in the paradigm of Rasch models, the case for these models is that if the data fit the model then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then treated as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand further articulates the implications of Rasch models and develops complementary software to better understand a range of anomalies, for example how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model when each person has sat multiple tests, for example for selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
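As a concrete illustration of the invariance property described above, the sketch below works through the dichotomous case in conventional Rasch notation (θ for the person location, δ for the item difficulty); the notation and the two-item example are illustrative assumptions, not drawn from Andrich's publications.

\[
\Pr(X_{vi}=1 \mid \theta_v,\delta_i) \;=\; \frac{\exp(\theta_v-\delta_i)}{1+\exp(\theta_v-\delta_i)}
\]
% Compare items 1 and 2 using a single person v, conditioning on exactly one of
% the two responses being correct (the sufficient statistic for theta_v):
\[
\Pr(X_{v1}=1,\, X_{v2}=0 \mid X_{v1}+X_{v2}=1)
\;=\; \frac{\exp(\theta_v-\delta_1)}{\exp(\theta_v-\delta_1)+\exp(\theta_v-\delta_2)}
\;=\; \frac{1}{1+\exp(\delta_1-\delta_2)}
\]
% The person parameter theta_v cancels, so the comparison of the two items does not
% depend on which person answered them; the symmetric argument, conditioning on a
% person's total score across items, frees comparisons of persons from the item
% parameters.

This separability is the formal sense in which, when the data fit the model, comparisons are invariant within the frame of reference; the 2010 paper cited above extends the same sufficiency and conditioning argument to the polytomous Rasch model, so that person locations can be estimated independently of the test parameters.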

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is the Course Director of the Master's in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment – Europe and Lead Editor of the journal Assessment in Education: Principles, Policy & Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment, and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB): on critical thinking in PYP schools internationally, and an evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Research Council of Norway, the Education Endowment Foundation, the State Examinations Commission (Ireland), the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE, the Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual's Research Advisory Board in the UK (2021 – 2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014 – 2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the Emirates, and has carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance (AQA), where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she has an MBA. Her current research projects include setting and maintaining standards in national examinations; examination reform, in particular the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual's Standing Advisory Group and as a member of the Welsh Government's Curriculum and Assessment Group. She is a member of the Editorial Board of the journal Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen's (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that gave primary and secondary school students the opportunity to provide input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including by elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasising that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that differs from the traditional paradigm in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying Rasch models, the case for the model is that if the data fit it, then, within a frame of reference, it provides invariant comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then treated as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand further articulates the implications of Rasch models and develops complementary software to better understand a range of anomalies, for example how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and how to deal with multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model where each person has sat multiple tests, for example for selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
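
The invariance claim above can be made concrete with a minimal, textbook-style sketch (not drawn from this profile) of the dichotomous Rasch model, in which person n has location (ability) \beta_n and item i has difficulty \delta_i:

\[
\Pr(X_{ni}=1) = \frac{\exp(\beta_n - \delta_i)}{1 + \exp(\beta_n - \delta_i)},
\qquad
\Pr(X_{ni}=1 \mid X_{ni}+X_{mi}=1) = \frac{\exp(\beta_n)}{\exp(\beta_n) + \exp(\beta_m)}.
\]

The second expression conditions on two persons, n and m, giving exactly one correct response to the same item; the item parameter \delta_i cancels, so the comparison of the two persons does not depend on which item was used, and a symmetric argument yields item comparisons that do not depend on which persons were sampled. This is the sense in which fit of the data to the model, rather than fit of the model to the data, underwrites invariant measurement.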

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is Professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and a Fellow of Kellogg College. She is the Course Director of the Master's in Educational Assessment at the Department of Education, elected Vice-President of the Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education: Principles, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses on bridging research on self-regulation and classroom-based assessment, and on making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by the Social Sciences and Humanities Research Council of Canada (2020–2021). She is also currently Principal Investigator for two research projects funded by the International Baccalaureate (IB), on critical thinking in PYP schools internationally and on the evaluation of education reforms in Kent, UK (2020–2021). In 2020, she led research on critical thinking in the Diploma Programme in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018–2023). She was the Research Manager of PIRLS 2016, funded by the UK Department for Education, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016–2019). Since coming to Oxford in 2012, she has received funding from ESRC-DFID, the OECD, the Norwegian Research Council, the Education Endowment Foundation, the State Examinations Commission (Ireland), the Jacobs Foundation and the International Baccalaureate, totalling more than £2 million, in addition to a single grant of £4 million in collaboration with SLATE, the Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher in the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010–2011).

She is Adjunct Professor at the Norwegian University of Science and Technology (NTNU), a member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual’s Research Advisory Board in the UK (2021–2023) and an expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and the OECD (2014–2023). She has advised on the implementation of formative assessment programmes in India, South Africa, Norway and the United Arab Emirates, and has carried out policy work for UNESCO, the OECD and the Norwegian Ministry of Education.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas:

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance (AQA), where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology, and she also holds an MBA. Her current research projects include setting and maintaining standards in national examinations; examination reform, including the impact of modular and linear examinations at GCSE; Assessment for Learning in Africa (AFLA); intelligent accountability; and the Progress in International Reading Literacy Study (PIRLS) England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design. Jo-Anne works extensively with government and industry partners, including serving as Standing Adviser to the House of Commons Education Select Committee, as a member of Ofqual’s Standing Advisory Group and as a member of the Welsh Government’s Curriculum and Assessment Group. She is a member of the Editorial Board of the Oxford Review of Education and of the International Advisory Board of Assessment in Education: Principles, Policy & Practice. She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umeå. From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

Samantha-Kaye Johnston is a Research Officer at the Oxford University Centre for Educational Assessment (OUCEA).

Samantha-Kaye was formally educated in Jamaica, where she completed her Bachelor of Science in Psychology. In England, she received her Master of Arts in Education and then completed her Ph.D. in Psychology in Australia. Using a cognitive psychology lens, Samantha’s expertise and interest lie at the intersection of education and psychology. She aims to link these areas with evidence-based e-learning technologies to improve teaching, learning, and assessment outcomes.

Samantha has 10+ years of experience in the project management sector, where she has been actively involved in education development initiatives. In 2016, as part of her Project Capability, she founded the Marlon Christie scholarship, which provides a scholarship for Jamaican students with reading difficulties to attend university. As an extension of this project, Samantha founded Reading for Humanity, to elevate the science of reading, the science of learning, and the science of technology within the classroom. Her work is informed by her experience as an advocate and researcher in Jamaica, England, and Australia, primarily within the K-12 sector, as well as within non-governmental, private, community organisations, and United Nations bodies.

She has experience as a University Associate at Curtin University and Teaching Associate at Monash University, as part of their undergraduate and graduate psychology teaching teams. Within this space, she has been teaching and/or assessing various psychology units, including Introduction to Psychology, Developmental Psychology, Science and Professional Practice in Psychology, and Indigenous and Cross-Cultural Psychology.

During her time in the ed-tech sector, and in collaboration with UNESCO’s Future of Education Initiative, she conceptualised and spearheaded Project Seat-at-the-Table (Project SAT), an international qualitative research initiative that aimed at providing primary and secondary school students with the opportunity to provide their input on the future of technology in their education. As an affiliate at the Berkman Klein Centre for Internet and Society at Harvard University, Samantha’s seeks to strengthen internet governance within online learning. In particular, she is interested in ensuring that the rights of young students are protected while they interact within the digital space, including elevating the voices of students in decision-making processes.

Above all, Samantha believes that every child should have the same opportunity to shape their destiny, emphasing that we cannot always build the future for them, but we can build them for the future. Consequently, her goal is to ensure that teachers implement evidence-based pedagogical approaches that will strengthen 21st-century skills, including, critical thinking and creativity, in all students.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement is has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model, then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Then any misfit between the data and the chosen Rasch model is seen as an anomaly that needs to be explained by qualitatively by reference to the theory behind the construction of the instrument, and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and in health outcomes. The second area of research is further articulating the implications of the Rasch models and development of complementary software, to better understand a range of anomalies, for example, how to identify guessing in multiple choice items, how to identify and handle response dependence between items, and mutldimensionality. He has also recently published the paper which shows how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat a multiple of tests, for example for selection for university entry. Andrich, D. (2010) Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika. (Online First Publication).

Therese N. Hopfenbeck is professor of Educational Assessment, Director of the Oxford University Centre for Educational Assessment and fellow at Kellogg College. She is the Course Director of the Master in Educational Assessment at the Department of Education, elected Vice-President of The Association for Educational Assessment-Europe and Lead Editor of the journal Assessment in Education, Principle, Policy and Practice.

Dr Hopfenbeck’s research agenda focuses upon bridging research on self-regulation and classroom-based assessment and making sense of international large-scale studies in education. In collaboration with Professor Nancy Perry, University of British Columbia, she is currently leading an international network of researchers disseminating classroom-based research, funded by Social Sciences and Humanities Research Council of Canada (2020 – 2021). She is also currently Principal Investigator for two research projects funded by IB, on critical thinking in PYP schools internationally and evaluation of education reforms in Kent, UK (2020 – 2021). In 2020, she led the research on Critical Thinking in the Diploma Program in Australia, England and Norway (https://ibo.org/research/outcomes-research/diploma-studies/critical-thinking-skills-of-dp-students/). Dr Hopfenbeck is also Principal Investigator for the PISA 2022 study in England, Northern Ireland and Wales, in collaboration with Pearson UK (2018 – 2023). She was the Research Manager of PIRLS 2016, funded by The Department of Education, UK.gov, and was Principal Investigator of a major ESRC-DFID research study, Assessment for Learning in Africa (ES/N010515/1) (2016 – 2019). Since coming to Oxford in 2012, she has been the recipient of funding from ESRC-DFID, OECD, The Norwegian Research Council, Education Endowment Foundation, State Examinations Commissions Ireland, Jacob Foundation and the International Baccalaureate totalling more than £2 mill in addition to a single grant of £4 mill in collaboration with SLATE: Centre for the Science of Learning & Technology at the University of Bergen, Norway. Prior to her appointment at Oxford, she worked as a post-doctoral researcher at the University of Oslo’s research group for Measurement and Evaluation of Student Achievement at the Unit for Quantitative Analysis of Education (2010 – 2011).

She is Adjunct Professor of the Norwegian University of Science and Technology (NTNU), member of the Visiting Panel for Research at the Educational Testing Service (ETS) in Princeton, chair of Ofqual Research Advisory Board in UK (2021 – 2023) and expert member of the PISA 2022 Questionnaire Framework group, appointed by ETS and OECD (2014 – 2023). She has advised on the implementation of formative assessment programs in India, South Africa, Norway and the Emirates and carried out policy work for UNESCO/OECD and the Norwegian Ministry of Education Norway.

Therese has a presence on LinkedIn, ResearchGate, Academia.edu and Twitter: @TNHopfenbeck.

She welcomes students in the following areas

· Self-regulated learning/Metacognition

· Assessment for Learning/formative assessment

· International large-scale assessment (PIRLS, PISA)

· Classroom-based Assessment

· Implementation and evaluation of Assessment reforms

Dr Joshua McGrane is Associate Professor and Deputy Director of the Oxford University Centre for Educational Assessment. He completed his university medal-winning PhD in Quantitative Psychology at the University of Sydney.

He has been a Postdoctoral Fellow in the Graduate School of Education at the University of Western Australia and also worked as a Psychometrician for the Centre for Education Statistics and Evaluation (CESE) in the New South Wales Department of Education. This has provided him with extensive experience in academic and government contexts in education, including psychometric analyses and innovation of state and national-level educational assessments. At OUCEA, Josh’s Research Fellowship is funded by AQA. His research and teaching interests span the conceptual, empirical and statistical aspects of psychometrics and educational assessment. This includes questioning and investigating the key philosophical assumptions of educational and psychological measurement, as well as their historical development.

His empirical research employs a range of psychometric models, in particular the Rasch model, for scale development and validation of cognitive and non-cognitive assessments. These include achievement tests, learning progressions, performance assessments and social attitude surveys. He is committed to the use of innovative methods to enhance the validity and reliability of educational assessments, to promote the role of assessment in improving educational outcomes, and to strengthen the connection between substantive and methodological theories.

Before coming to Oxford, Jo-Anne held academic posts at the Institute of Education, University of London and the University of Bristol.

Jo-Anne previously held the position of Head of Research at the Assessment and Qualifications Alliance, where she managed the research programme and was responsible for the standard-setting systems for public examinations. Her first degree and doctorate were in psychology and she has an MBA.  Her current research projects include Setting and Maintaining Standards in national examinations, Examination reform: the impact of modular and linear examinations at GCSE, Assessment for Learning in Africa (AFLA), intelligent accountability and the Progress in International Reading Literacy Study England national centre.

Her research interests are in educational assessment, including system-wide structures and processes, examination standards, marking and assessment design.  Jo-Anne conducts a lot of work with government and industry partners, including acting as the Standing Adviser to the House of Commons Education Select Committee, a member of Ofqual’s Standing Advisory Group and membership of the Welsh Government’s Curriculum and Assessment Group.  She is a member of the Editorial Board of the Oxford Review of Education journal and the International Advisory Board of Assessment in Education: principles, policy & practice.  She has been a Visiting Professor at the universities of Bergen, Queen’s (Belfast) and Umea.  From 2013 to 2015 she was President of the Association for Educational Assessment – Europe.

David Andrich is Chapple Professor, Graduate School of Education, The University of Western Australia.

He obtained a bachelor degree in Mathematics and his Masters degree in Education from The University of Western Australia and his PhD from the University of Chicago, for which he was awarded the Susan Colver Rosenberger prize for the best research thesis in the Division of the Social Sciences. He returned to The University of Western Australia, and in 1985 was appointed Professor of Education at Murdoch University, also in Western Australia. In 2007 he returned to The University of Western Australia as the Chapple Professor of Education. In 1977 he spent 6 months as a Research Fellow at the Danish Institute for Educational Research working with Georg Rasch and he has been a Visiting Professor at the University of Trento in Italy for two periods. He has held major research grants from the Australian Research Council continuously since 1985 and has conducted commissioned government research at both the national and state levels. In 1990, he was elected Fellow of the Academy of Social Sciences of Australia for his contributions to measurement in the social sciences. He is especially known for his work in modern test theory, and in particular Rasch models for measurement, ranging in topics from the philosophy of measurement, through model exposition and interpretation, to software development. He has published in Educational, Psychological, Sociological and Statistical journals. He is the author of Rasch Models for Measurement (Sage) and coauthor of the software package Rasch Unidimensional Measurement Models (RUMMLab).

Research

David Andrich’s current research in applying Rasch models for measurement has two strands.

The first involves articulating a research and assessment paradigm that is different from the traditional paradigm in which statistical models are applied. In the traditional paradigm, the case for choosing any model to summarise data is that it fits the data at hand; in contrast, in applying the paradigm of Rasch models, the case for these models is that if the data fit the model then, within a frame of reference, they provide invariance of comparisons of persons with respect to items, and vice versa. Any misfit between the data and the chosen Rasch model is then seen as an anomaly that needs to be explained qualitatively, by reference to the theory behind the construction of the instrument and the operational aspects of its application. He argues that this approach improves the quality of social measurement, including in education, psychology, sociology, economics and health outcomes.

The second strand involves further articulating the implications of Rasch models and developing complementary software, to better understand a range of anomalies: for example, how to identify guessing in multiple-choice items, how to identify and handle response dependence between items, and multidimensionality. He has also recently published a paper showing how person location estimates can be obtained independently of all test parameters using the general unidimensional Rasch model in the case where each person has sat multiple tests, for example for selection for university entry: Andrich, D. (2010). Sufficiency and conditional estimation of person parameters in the polytomous Rasch model. Psychometrika (Online First Publication).
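To make the invariance of comparisons referred to above concrete, a standard illustration uses the simplest, dichotomous form of the Rasch model, writing β_n for the location of person n and δ_i for the difficulty of item i:

\[
\Pr\{X_{ni}=1\} = \frac{\exp(\beta_n - \delta_i)}{1 + \exp(\beta_n - \delta_i)}.
\]

Comparing two persons n and m on the same item, and conditioning on exactly one of them responding correctly, gives

\[
\Pr\{X_{ni}=1 \mid X_{ni}+X_{mi}=1\} = \frac{\exp(\beta_n)}{\exp(\beta_n) + \exp(\beta_m)},
\]

which no longer involves δ_i: the comparison of the two persons is invariant with respect to the particular item used, and the symmetric argument shows that comparisons of items are invariant with respect to the persons sampled. The person’s total score is a sufficient statistic for β_n, which is the property underpinning the conditional estimation discussed in the paper cited above.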

