The University of Southampton
Modern Languages and Linguistics, Part of Humanities
Phone:
(023) 8059 7650
Email:
Ying.Zheng@soton.ac.uk

Dr Ying Zheng BA, MA, MEd, PhD

Associate Professor, Director of the Confucius Institute


Dr Ying Zheng is an Associate Professor in Modern Languages and Linguistics at the University of Southampton and Director of the Confucius Institute.

Before joining the University of Southampton in 2013, Ying worked as a Psychometrician and Director of Research (2009-2013) in the Language Testing Division of Pearson, London, a world-leading multinational education company. She obtained her MEd (2005) and PhD (2009) in Cognitive Studies from Queen's University, Canada, specializing in Second Language Testing and Assessment.


Research interests

My research interests include psychometric analysis of language testing data, ESL/EFL learner characteristics, and quantitative research methodology. My work has covered language tests from China, Canada, and the UK, including the College English Test in China, the Ontario Secondary School Literacy Test in Canada, and the Pearson suite of language tests, particularly the Global Scale of English (GSE), the Pearson Test of English Academic (PTE Academic), and the Pearson Test of English General (PTE General).

I’m currently also interested in Mandarin Chinese teachers’ professional development, learner motivation, and language teaching pedagogy. 

I welcome PhD research proposals in the above areas, especially in the area of language assessment/testing in English or in Chinese.

 

Consultancy roles

I have been working as an external psychometric consultant for language testing organizations. Projects that I have consulted on include:

  • Rwanda project data analysis for the Assessment Research Group, British Council
  • Hanoi project data analysis for the Assessment Research Group, British Council
  • Pearson Test of English Academic Item Bank Recalibration
  • Standard Setting: Putting a Polish English Test onto the Global Scale of English (GSE)
  • Calibrating Pearson Progress Test: Aligning to the Global Scale of English (GSE)
  • Calibrating Descriptors for Young Learner English and Professional English
  • Aptis for Teens: Analysis of Pilot Test Data
  • Investigating English Language Levels for the Teacher Development Programme (TDP) in Three Northern States in Nigeria

Current Research Projects  

 

Project title: An investigation of item type efficiency and construct relevance in the case of Pearson Test of English Academic (2018-2019)

Funded by: Pearson Education 

This two-stage study (2018-2019) will use the parameters of test construct relevance and item efficiency to examine speaking and reading item types on PTE Academic. The purpose of the study is to inform the design of the next generation of PTE Academic, using statistical evidence to model item difficulty and discrimination as well as item efficiency and quality.

 

Project title: HSK test taker characteristics, test performance and implications for HSK test constructs

Funded by: Chinese Testing International

 

This project (2018-2019) will use global Mandarin Chinese testing data to model the relationship between HSK test performance and test taker characteristics, identifying patterns that can inform our understanding of test taker behaviour and the HSK test constructs. The project also aims to compare the identified patterns with those found in English tests and to explore the reasons behind any differences, with a view to pedagogical implications. It is conducted in collaboration with two co-investigators, Dr Yang Lu from the University of Nottingham and Dr Clare Wright from the University of Leeds.

 

Project title: Investigating Mandarin Chinese Teachers' Professional Development Needs in the UK

Project title: An investigation into the motivations of learners of Chinese as a foreign language

         

Completed Funded Research Projects

 

Project title: Assessing Language Progress: How to Measure, and What to Compare to? (2017-2018)

Funded by: Pearson Education

This project recruited the University's pre-sessional students from the 2017 cohort to examine their language progress via a series of tests, including in-house assessment batteries and Pearson Progress Tests. The purpose was to track progress patterns and development trajectories in relation to test takers' individual backgrounds and, where possible, to establish the concurrent validity of the two testing systems.

 

Project title: Investigating the Practice of the CEFR outside Europe: A Case Study on English Writing Assessment in China

Funded by: British Council ELT Partnership Research Award 2014

This project explored the possible application of the Common European Framework of Reference for Languages (CEFR) in China and provided Chinese ELT teachers with hands-on practice in using the CEFR scales to assess university students' English writing. Rating scores across levels were compared between CEFR experts and Chinese ELT teachers. The researchers aimed to provide an empirical evaluation of whether the CEFR scales are applicable in the Chinese context and to what extent they need to be modified.

 

Project title: Incorporating Feedback into an Online Assessment Module: Exploring Its Potential

Funded by: Adventure in Research Grant 2014, University of Southampton

This is a collaborative project with Pearson ELT (David Booth, Test Development Director; Shaida Mohammadi, Test Development Manager). The aim of the project is to examine the extent to which different modes of feedback can affect language learners' interlanguage or intake in the context of online learning and assessment modules.

 

Project title: Aptis in China: Exploring Stakeholders' Perceptions of its Validity and Practicality

Funded by: British Council East Asia Assessment Research Grants 2014

This project is a collaboration with Dr Yanyan Zhang, Associate Professor at Wuhan University. It aims to investigate Aptis test takers' perceptions of the test's validity and practicality in China. Aptis is a newly launched English test developed by the British Council.

Affiliate research group

UK Association for Language Testing and Assessment (UKALTA) - Executive Committee member



I teach the following modules:

LING3014 Language Testing and Assessment in Society

LING6001 Research & Inquiry in Applied Linguistics

LING6007 Assessment of Language Proficiency

RESM6004 Quantitative Methods 1

LING6017 Research Skills (Dissertation)

 

Postgraduate Student Achievements

Recipient of British Council Assessment Research Award 2016:

Maria Georgina Fernandez Sesma

Recipient of British Council Assessment Research Award 2015:

Elsa Fernanda Gonzalez

Current PhD student supervision

Napol Artmungkun (2018-2022)

ESP Assessment in Thai Context

Ashwaq AlThowibi (2017-2020)

The Impact of Students' Assessment Literacy on their Learning Experience Abroad: a Comparative Study

Lizbeth Morales Berlanga (2017-2020)

Assessment Practices and Language Policies in Mexican EMI Programmes

Alshahad Adnan Aldereihim (2016-2019)

Assessment of Collocation Progression in Second Language Learners' Writing at Different Levels of Proficiency: a Corpus-Based Study

Ricardo de la Garza Cano (2015-2019)

Language Assessment Literacy Development among Mexican teachers of English in Higher Education: Experiences, Knowledge and Practices

Maria Georgina Fernandez Sesma (2014-2017)

Blended Learning: Exploring EFL Teachers and Students' Attitudes toward Usage and Continuance Intention to Use Information and Communication Technologies

Ada Arellano (2014-2018)

Investigating Key Elements To Design and Implement A Speaking Test To Foster Engineering Students' Oral Skills

Nadia Patricia Mejia Rosales (2013-2016)

Pre-service teachers’ cognitive development and the impact of their cognitions on teaching practice

 

Completed PhD supervision

Elsa Fernanda Gonzalez (2013-2018)

The Impact of Assessment Training on English as a Foreign Language University Professors’ Classroom Writing Assessment: Reported Practice and Perceptions 

Ibrahim Alzahrani (2013-2017)

Exploring the Effects of Language Learning Strategies Instruction on Saudi EFL College Students' Strategy Awareness and Proficiency 

Gandy Griselda Quijano Zavala (2013-2017)

Attitudinal and Motivational Factors: Performance, Attitude and Motivation Change in a Mexican University Context

Nahum Samperio Sanchez (2013-2016)

General Learning Strategies: Identification, Transfer to Language Learning and Effect on Language Achievement 

Workshops:

De Jong, J. & Zheng, Y. (2017, June). Use and interpretation of statistical analysis procedures to improve testing and assessment (Intermediate level). European Association of Language Testing and Assessment 2017, Sèvres, France.

Zheng, Y. (2016, January). Statistics for Those Who Dislike Statistics. Invited by the Scientific Society for Saudi Students in the UK, Southampton, UK.

Zheng, Y. (2016, October). Statistics in Applied Linguistics. Invited by University of Nottingham Ningbo, China.

Zheng, Y. (2014, December). An Introduction to Quantitative Methods in Applied Linguistics. Invited by Wuhan University, China.

 

Selected Conference Presentations (from 2010)

Lin, S. & Zheng, Y. (2018, July). Investigating Chinese Teachers' Professional Development Needs in Southern UK. Presented at British Association of Applied Linguistics (BAAL) Language Learning and Teaching SIG, Southampton, UK.

De la Garza Cano, R. & Zheng, Y. (2018, May). Assessing Language Progress: How to measure, and what to compare to? Presented at European Association for Language Testing and Assessment (EALTA), Bochum, Germany.

De la Garza Cano, R. & Zheng, Y. (2016, November). EAP assessments: A study to track test-takers' progression. Presented at the Language Testing Forum (LTF), Reading, UK.

Zheng, Y. & Zhang, Y. (2016, May). Towards Localisation of a Test: What are the Essential Considerations? Presented at European Association for Language Testing and Assessment (EALTA), Valencia, Spain.

Zheng, Y., Zhang, Y., & Yan, Y. (2015, November). Investigating the Practice of the CEFR outside Europe: A Case Study on English Writing Assessment in China. Presented at the International Conference on Language Testing and Assessment. Guangzhou, China.

Zheng, Y., Mohammadi, S., & Booth, D. (2014, October). Incorporating feedback into an online assessment module: Exploring its potential. Presented at the 7th International Conference on English Language Teaching (ELT) in China. Nanjing, China.

Zheng, Y., Mohammadi, S., & Booth, D. (2014, June). Linking a test to CEFR: An application of extended descriptors. Presented at Language Testing Research Colloquium (LTRC), Amsterdam, NL.

Mohammadi, S., Zheng, Y., & Bonk, W. (2014, June). Exploring the effectiveness of online feedback at different CEFR levels. Presented at Language Testing Research Colloquium (LTRC), Amsterdam, NL.

Colley, M., Hancock, J., & Zheng, Y. (2013, November). An investigation into raters' experience of rating different linguistic features. Presented at the 33rd Language Testing Forum (LTF), Nottingham, UK.

De Jong, J. & Zheng, Y. (2013, July). Optimizing raw score usage to reduce measurement error. Presented at the 35th Language Testing Research Colloquium on Broadening horizon: Language assessments, diagnosis, and accountability. Seoul, Korea.

Zheng, Y. & Mohammadi, S. (2013, May). An investigation into the writing construct(s) measured in Pearson Test of English Academic. Presented at the EALTA writing assessment special interest group. Istanbul, Turkey.

Zheng, Y. & De Jong, J. (2013, May). Linking to the CEFR: validation using a priori and a posteriori evidence. Presented at the International Conference in Language Testing in Europe: Time for a New Framework. University of Antwerp, Belgium.

Zheng, Y. & Hancock, J. (2013, April). Academic English test performance: the influence of test taker background. Presented at the International Association of Teachers of English as a Foreign Language (IATEFL). Liverpool, UK.

Zheng, Y. & Li, L. (2012, November). Test takers, test trainers, and test developers: Are they speaking the same language? Presented at the 32nd Language Testing Forum (LTF), Bristol, UK.

Zheng, Y., Jones, G., & Buckland, S. (2012, June). English Language Skills and Academic Performance: A Study of Chinese Test Takers. Presented at the 9th annual European Association for Language Testing and Assessment (EALTA), Innsbruck, Austria.

Ackermann, K., & Zheng, Y. (2012, April). Investigating item variables influencing test taker performance on multiple choice items. Presented at the 34th Language Testing Research Colloquium (LTRC), Princeton, NJ, US.

De Jong, J., & Zheng, Y. (2012, April). Native speaker responses in language test development. Presented at the 34th Language Testing Research Colloquium (LTRC), Princeton, NJ, US.

De Jong, J., & Zheng, Y. (2011, November). Optimizing raw score usage to reduce measurement error. Presented at the 31st Language Testing Forum (LTF), Warwick, UK.

Lu, Q., & Zheng, Y. (2011, November). Investigating Item Features that Influence Test-takers' Performance on 'Summarize-Spoken-Text' Item Type. Presented at the 31st Language Testing Forum (LTF), Warwick, UK.

Jones, G., & Zheng, Y. (2011, October). Test repetition or test preparation? Presented at the National Association of Foreign Language Education (NAFLE) annual conference, Beijing, China.

Zheng, Y., & Mohammadi, S. (2011, October). Writing Assessment in Pearson Test of English Academic. Presented at the Writing Assessment in Higher Education: Making the Framework work. Amsterdam, NL.

De Jong, J., & Zheng, Y. (2011, August). The Role of Native Speaker Responses in Language Test Development. Presented at the 16th World Congress of Applied Linguistics (AILA). Beijing, China.

Zheng, Y. (2011, August). Chinese University Students' Motivation, Anxiety, Global Awareness, Linguistic Confidence, and English Test Performance: A Correlational and Causal Investigation. Presented at the Symposium on Consequential validity of large-scale testing: Multiple sources of evidence from test-takers at the 16th World Congress of Applied Linguistics (AILA). Beijing, China.

Zheng, Y. (2011, June). The validity of using integrated tasks to assess listening skills. Presented at the 33rd Language Testing Research Colloquium (LTRC). Ann Arbor, MI, US.

Zheng, Y. (2011, April). Assessing Academic English listening: A multi-trait multi-method approach. Presented at the Academic Listening symposium at the International Association of Teachers of English as a Foreign Language (IATEFL). Brighton, UK.

Zheng, Y. (2010, November). Making language test score report understandable and useful. Presented at the 30th Language Testing Forum (LTF), Lancaster, UK.

Zheng, Y. (2010, November). Applying the EALTA guidelines for good practices in language testing in the development of Pearson Test of English Academic. Presented at the British Association of Lecturers in English for Academic Purposes (BALEAP) Professional Issues Meeting, Nottingham, UK.

Zheng, Y., De Jong, J., & Li, J. (2010, July). Establishing test validity for Pearson Test of English Academic. Presented at the International Test Commission (ITC), Hong Kong, China.

Zheng, Y. (2010, April). Knowing the test takers: Investigating test taker background, test taker characteristics, and their test performance. Presented at the 32nd Language Testing Research Colloquium (LTRC): Crossing the threshold: investigating levels, domains and frameworks in language assessment. Cambridge, UK.

Dr Ying Zheng
Building 65, Faculty of Arts and Humanities, University of Southampton, Avenue Campus, Highfield, Southampton SO17 1BF, United Kingdom

Room Number: 65/3027

