Department of Education

Jon S. Twing

Honorary Research Fellow

Dr. Twing is Senior Vice President at Pearson Assessments and Honorary Research Fellow at OUCEA.

He is responsible for Pearson's educational and certification assessment services globally and has most recently been providing measurement, statistical and psychometric consultation worldwide. He believes the gap is staggering between what local jurisdictions around the world know and need and best practice in education, specifically in research, assessment, training and reform. He aims to help bridge this gap by linking the expertise of OUCEA and the wider University of Oxford to applications around the world. By working closely with OUCEA's experts as he delivers consultancy internationally, he hopes to improve learning locally while also creating opportunities for OUCEA to expand its influence on education.

Dr. Twing's role encompasses the following:
• Responsible for all educational, licensure and certification measurement decisions and policies for Pearson Assessments
• Consults with Pearson senior executives regarding learning systems, measurement and measurement research
• Responsible for expansion of Pearson learning services as they apply to measurement worldwide
• Provides senior leadership for Pearson growth strategy, publishing strategy, and business development
• Leads and guides business strategy, focusing on development, marketing, and validity/measurement rigor of potential products and services
• Responsible for several operational groups at Pearson:
  • Content Development Services
  • Psychometric and Testing Services
  • Assessment Accessibility
  • Assessment Publishing
• Provides oversight for an organization of approximately 500 content specialists, editors, assessment consultants, statistical analysts, research scientists, and publishing and quality assurance specialists
• Responsible for all content development and psychometric functions (e.g., scaling, equating, DIF analyses and legal defensibility) for various assessment programs and for interim and benchmark assessments globally

Research interests
• Estimation of norms distributions from incomplete data
• Linking assessments when assumptions are violated
• Many-faceted performance scoring
• Legal defensibility of assessment
• Innovative and technology enhanced assessments
• Making assessments accessible to all

Professional affiliations
• National Council on Measurement in Education
• American Educational Research Association
• The Psychometric Society
• Iowa Educational Research and Evaluation Association
• International Testing Commission
• Association of Test Publishers
• American Psychological Association
• American Psychological Association, Division 5

Professional services
• Research Fellow at the University of Sydney
• Serves on The University of Iowa, College of Education Advisory Board
• Chair of the Association of Test Publishers Working Group on Operational Best Practices
• Member of the Achieve ADP Research Alliance
• Member of the MetaMetrics, Inc. Technical Advisory Committee
• Served on the Board of Directors for Pearson Knowledge Technologies
• Served on the Board of Directors for ACT Aspire
• Serves on the Board for JDRF of Eastern Iowa
• Past President, Iowa Educational Research and Evaluation Association
• Past Reviewer, Journal of Educational Measurement
• Past Reviewer, American Journal of Evaluation
• Reviewer, Educational Research Quarterly
• Reviewer, American Education Research Association Annual Meeting
• Reviewer, National Council on Measurement in Education Annual Meeting
• Discussant, American Educational Research Association Annual Meeting
• Discussant, National Council on Measurement in Education Annual Meetings
• AERA Division H Business Meeting Sponsor

Publications
• Twing, J. S., & O'Malley, K. J. (2018). Validity and reliability of direct assessments. In T. Cumming & M. D. Miller (Eds.), Enhancing Assessment in Higher Education: Putting Psychometrics to Work. Stylus Publishing, LLC.
• Tognolini, J., & Twing, J. S. (2013). Normalization of the Joint Institute of Technology Joint Entrance Examination (JEE) and State Specific Institutes of Technology (IITs) for Selection when Assumptions are Violated. Research publication commissioned by the Central Board of Secondary Education (India) on behalf of the Centre for Assessment, Evaluation and Research (CAER), New Delhi, India.
• McClarty, K. L., O'Malley, K. J., Way, W. D., & Twing, J. S. A comprehensive approach to college and career readiness. Paper submitted to Educational Measurement: Issues and Practice, 3 April 2013.
• Twing, J. S. Development of Open Technology Standards. Submitted to US Department of Education (USED) Office of Special Education Programs, in response to USED post concerning development of open technology standards for managing and delivering student assessments and assessment results, 7 November 2011.
• Twing, J. S. Assessment Technology Standards Request for Information. Submitted to US Department of Education (USED) Office of Educational Technology, 17 January 2011.
• Way, W., Twing, J. S., Camara, W., Sweeney, K., Lazer, S., and Mazzeo, J. Some Considerations Related to the Use of Adaptive Testing for the Common Core Assessment. February 2010.
• Lazer, S., Mazzeo, J., Way, W., Twing, J. S., Camara, W., and Sweeney, K. Thoughts on Linking and Comparing Assessment on Common Core Standards. May 2010.
• Lazer, S., Mazzeo, J., Way, W., Twing, J. S., Camara, W., and Sweeney, K. Thoughts on an Assessment of Common Core Standards. 2010.
• Camara, W., Sweeney, K., Twing, J. S., Way, W. D., Lazer, S., and Mazzeo, J. Designing and Operating a Common High School Assessment System. April 2010.
• Camara, W., Twing, J. S., Way, W. D., Slover, L., Sconing, J., and Loomis, S. Measuring College Readiness: Validity, Cut Scores, and Looking to the Future. Panel discussion at the CCSSO National Student Assessment Conference, Detroit, 2010.
• Twing, J. S., Young, M. J., Shimko, V., and Schmidek, D. Investigating the Efficacy of Learnia in New Jersey: Phase I-Post Hoc Analyses of Effect Sizes and Correlations. 15 January 2010.
• Nichols, P., Twing, J. S., Mueller, C. D., & O'Malley, K. Standard-Setting Methods as Measurement Processes. Educational Measurement: Issues and Practice, 29 (1), 14–24, Spring 2010.
• Twing, J. S. Response to the Race to the Top Assessment Program Request for Input. Submitted to US Department of Education (USED) Office of Elementary and Secondary Education in response to USED questions regarding assessment design and the future needs of a comprehensive learning system, 2 December 2009.
• Twing, J. S. The "intrinsic rational validity" of an integrated education system. NCME Newsletter, June 2007, 15 (2), 8–9.
• Twing, J. S. Growth, Gains, and Progress under NCLB. Position paper, 2006. Pearson School.
• Fisher, T., and Twing, J. S. Toward a Growth Centric Assessment Model. Paper from Pearson Educational Measurement, April 2006, 5–14.
• Twing, J. S., & Vickers, D. Quality Assurance in Essay Scoring. Pearson Research Bulletin #2, 2006. Pearson, Iowa City, Iowa.
• Schafer, W. D., and Twing, J. S. Growth Pathways as a Basis for AYP. In R. W. Lissitz (Ed.), Longitudinal and Value Added Models of Student Performance, 2006. JAM Press, Maple Grove, MN.
• Miller, G. E., Yoes, M. E., and Twing, J. S. (2004). Estimation of the All Tests Pass Rate When No Examinee Took All of the Tests. Applied Measurement in Education, 17 (4), 393–406.
• Miller, G. E., Rotou, O., and Twing, J. S. (2004). Evaluation of the 0.3 logits screening criterion in common item equating. Journal of Applied Measurement, 5 (2), 172–177.
• Twing, J. S. Some "Old-fashioned" notions about electronic testing. NCME Newsletter, September 2002, 10 (3), 7.
• Twing, J. S. Large-scale assessment: The art of the compromise. NCME Newsletter, December 2000, 8 (4), 4.
• Cruse, K. L., and Twing, J. S. The history of statewide achievement testing in Texas. Applied Measurement in Education, 2000, 13 (4).
• Smisko, A., Denny, P., and Twing, J. S. The Texas model for content and curricular validity. Applied Measurement in Education, 2000, (4).
• Twing, J. S. Implementing a district-wide standards-referenced assessment system (DSRAS). Iowa Department of Education, 1999.
• Twing, J. S. Review of Objective Measurement: Theory into Practice. Journal of Educational Measurement, 1998, 35 (1), 82–88.
• Forsyth, R. A., Ansley, T. N., and Twing, J. S. The validity of normative data provided for customized tests: Two perspectives. Applied Measurement in Education, 1992, 5 (1).
• Forsyth, R. A., Ansley, T. N., and Twing, J. S. Three applications of customized testing in local school districts. Applied Measurement in Education, 1992, 5 (1).
• Pope-Davis, D. B., and Twing, J. S. The effects of age, gender and experience on measures of attitude regarding computers. Computers in Human Behavior, 1991, 7, 333–339.
• Vispoel, W. P., and Twing, J. S. Creating adaptive tests of musical ability with limited-size item pools. Proceedings of the Association for the Development of Computer Based Instructional Systems 32nd International Conference, 1990.
• Vispoel, W. P., and Twing, J. S. SIMUCAT: A program to simulate and score conventional item-response based computer adaptive tests. Applied Psychological Measurement, 1990, 14 (1), 108.
