
Predictability of IELTS in a high-stakes context: a mixed methods study of Chinese students’ perspectives on test preparation

Abstract

High-stakes language tests are used around the world as a gatekeeping tool in the internationalisation of higher education. However, the predictability of high-stakes language tests is seldom discussed, especially from students’ perspectives. This study addresses this gap by seeking to better understand how certain factors and conditions contribute to the predictability of IELTS from students’ perspectives within a high-stakes context. The study used a mixed methods approach to investigate the views and experiences of students within a Sino-UK joint college. Data collection comprised two concurrent strands: an online survey and group interviews. Findings suggest that IELTS can affect students negatively by narrowing their English learning scope and driving them towards self-isolated study, repeated test-taking and the purchase of predicted answers. Implications for language test preparation are discussed in light of the findings.

Introduction

This study examines the impact of an international English language test on students in the context of the internationalisation of higher education. Its aim is to investigate students’ experiences and perspectives of the International English Language Testing System (IELTS). The study’s participants are students in a cooperative Chinese–UK programme based in China. IELTS is endorsed as an English language test by the UK Home Office (Pearson, 2020) and is accepted worldwide by academic institutions, governments and employers as a measure of English proficiency (Dang & Dang, 2021; Pearson, 2021; Sari & Mualimin, 2021). For students in our research sample, IELTS is also a determining factor in whether they can enter their second-year courses, which use English as the medium of instruction, and obtain the degree from the UK university; thus, its consequences for students’ learning are significant. That said, insufficient research has been carried out on the impact of testing from the perspective of test-takers (Tsang & Isaacs, 2021; Elwood et al., 2017; Tsagari, 2013), especially Chinese students (Dong, 2019). This study, gathering and analysing data using a mixed methods approach, therefore throws new light on the perspectives of students.

IELTS is both high-stakes and predictable. A ‘high-stakes test’ is one whose results can profoundly influence those taking it by acting as a deciding factor in their future opportunities (Cheng, 2005). The extent to which it is acceptable for test-takers to predict what will be tested is known as test predictability (Baird et al., 2014). The effects of these two aspects of the test on students’ learning are therefore investigated through the lens of test predictability, a concept seldom discussed in the field of language testing. This focus allows closer attention to test preparation activities and to the extent to which learning can be affected by features of a test. The information gained from the Chinese students about their beliefs, attitudes, approaches and strategies will help future students understand their own process of test preparation more fully and enhance their awareness of the test-taking process. The findings will also add to what is known about the experiences of Chinese students taking IELTS.

Literature review

High-stakes tests

The term ‘high-stakes test’ means that the results of such a test can profoundly influence the future progress and opportunities of the test-takers. Poza and Shannon (2020) and Cheng (2005) state that the results of such tests (usually public examinations or large-scale standardised tests) are used as the basis for important decisions about test-takers’ future academic and employment opportunities, which can affect them profoundly. Noori and Mirhosseini (2021) and Stenlund et al. (2017) share this view, stating that the outcomes of high-stakes tests can have major consequences for participants; a high-stakes test such as IELTS can influence millions of test-takers (Clark et al., 2021). The influence of such high-stakes tests has many aspects: the greater a test’s consequences, the more likely it is to exert an effect on teaching and learning, because a high-stakes test serves as an incentive to promote students’ performance (Estaji & Ghiasvand, 2021). Stenlund et al. (2017) argue that those taking such a test are highly motivated because of the anticipated reward.

Test predictability

In recent years, test predictability has been scrutinised in the educational assessment field. Public concerns about the predictability of high-stakes examinations, such as college entrance exams, have often been expressed through the media (Baird et al., 2014). Despite many suggestions across differing contexts and geographical areas that exams are overly predictable (Elwood & Murphy, 2015), little empirical work has been carried out on predictability in the language testing field. A test’s predictability can generate effects that are either beneficial or problematic: test transparency can have positive effects, but overly predictable tests can negatively affect students’ learning.

Predictable tests can be troublesome in that they have the potential to lead to negative outcomes, for example when both students and teachers can anticipate the question formats, the level of performance required, the conditions of taking the test, the topics and the scoring. A narrowing of the curriculum being taught, learning by rote, a focus on test-related content and a failure to gain a deep understanding of certain subjects are among the negative aspects of an overly predictable test (Elwood & Murphy, 2015). Consequently, a test that is overly predictable can fail in its purpose of evaluating the intended knowledge; for instance, the breadth of students’ learning can be limited, as they will study only the exam-related topics.

There can also be a reduction in the depth of learning when students have advance knowledge of the exam requirements, enabling them to achieve high scores without completely understanding the test content. They will, for instance, use prepared responses derived from tutorial classes to gain higher marks. This is consistent with the claim by Hu and Trenkic (2021) that students’ test performance can be raised by private tutoring without any advancement in their knowledge and skills. The same pattern is reflected in the field of language testing, where high scores might be achieved on a language test, yet those achieving them go on to perform poorly in their subsequent academic courses. As a result, when test strategies and techniques allow test-takers to achieve results greater than their actual language ability deserves, the reliability of the test scores is put in doubt (Hu & Trenkic, 2021).

However, test predictability does not produce only negative effects. As Baird et al. (2014) have argued, such ‘positive predictability’ means that well-designed tests can improve learning and teaching, and can give motivation and useful information to students and teachers. Hu and Trenkic (2021) and Xie and Andrews (2013), for instance, found that exposing students to a test and then tutoring them for that test can have a positive impact on test scores. This might be because students’ test-taking skills improve when teachers use materials such as past papers, marking criteria and model answers to prepare them (Chong & Ye, 2021). Through such preparation, students feel less anxious when taking exams because fewer surprises await them in the exam room. Moreover, those who have previous experience of a particular test have performance advantages, as the effect of unfamiliarity is eliminated (Hu & Trenkic, 2021).

In terms of IELTS, Winke and Lim (2014) claim that it is beneficial to take the test once as practice; familiarisation with the format of the test is, in their view, a crucial step in test preparation. Although several studies make the case for predictability’s positive effects, others have raised doubts, arguing that the effects are less than obvious. One study on IELTS preparation, for example, found no difference in results between those who had taken a preparation class and those who had not (Gan, 2009).

Baird et al.’s framework of predictability

Baird et al. (2014) claimed that predictability is not a technical assessment term, yet it remains as important as such well-established assessment terms as fairness and validity. Their study looked into the views of students and teachers. In the present study, the focus is solely on the elements of IELTS that are predictable, in relation to Baird et al.’s (2014) framework. We investigate how these features (test format, scoring, performance format, test conditions and support materials) can lead to potential impacts on test-takers. The elements in Table 1 serve as the framework used to present our data. Test format is a predictable feature when the nature of the assessment, the weighting of question formats and test components, and particular topic areas are known beforehand; it can lead to students being taught on a test-wise basis. Scoring is a predictable feature if the manner in which performances are credited is known in advance; this can lead students to focus only on scoring criteria rather than on the syllabus content in general. Performance format is a predictable feature if the manner in which test-takers have to respond is known in advance; such knowledge can enable teachers to instruct students in how to deliver the required kind of performance. Test conditions and exam support materials can also be predictable: if test conditions are known in advance, then test performances can be practised. In a similar vein, exam support materials contribute to the predictability of a test when model answers, past papers and other publicly disseminated information sources are accessible and employed as test preparation materials.

Table 1 Elements of test predictability based on Baird et al. (2014)

Methodology

Research questions

The focus of the study is on students’ perceptions of IELTS and their views on the test preparation strategies they use when taking IELTS. Given this focus and context, the following research question emerged: What strategies do students employ to be successful in IELTS?

Research setting and participants

The setting is a Sino–UK programme within a Chinese university campus. Students who complete the 4-year programme are awarded graduation certificates from both the Chinese and UK universities involved. General syllabuses are jointly designed by both institutions, with all of the specialised courses delivered in English. In order to register successfully for Year 2, an overall IELTS score of 5.5 (representing a level of English between modest and competent) is required at the end of Year 1. At the end of Year 2, an overall IELTS score of 6.0 (competent level) is required; no student with an overall IELTS score below 6.0 is able to participate in courses in an EMI environment in Year 3. Consequently, IELTS has a robust gatekeeping role in this joint-programme context and exerts a significant level of impact on the learning experience of Chinese students. Despite the test’s popularity and its influence worldwide, very few studies have looked in detail at IELTS within such a China–UK joint-institution programme in China, or at the experience of IELTS test-takers in such a context.

The focus of this research is on Year 3 and Year 4 students, as these groups have experience both of taking IELTS and of attending EMI courses. One hundred and one students completed the online survey, a response rate of 54.5%; additionally, eight group interviews, involving 53 students, took place. Each of the focus group interviews comprised mixed student groups (with both male and female participants). Demographics of those involved are illustrated in Tables 2 and 3.

Table 2 Participants in the questionnaire
Table 3 Participants in group interviews

Research process

The research design has two strands, running concurrently. Strand 1 comprises an online survey, made available to students to gather their general views and opinions. Before the online survey was officially sent out, three copies were sent to three teachers in the joint college for piloting; their advice and suggestions were sought and, as a result, several non-essential questions were dropped. In Strand 2, eight group interviews, each lasting between 40 and 60 min, were held to gather richer data through engagement with students. The questions used in the group interviews were derived from aspects of the work on test predictability by Elwood et al. (2017) and Baird et al. (2014), and relate to test preparation activities and to students’ preparations for and perceptions of IELTS. The interviews adopted a semi-structured approach, enabling the interviewer (the first author) to stay focused while allowing participants more freedom to speak on issues important to them (Birmingham & Wilkinson, 2003). The interview questions were designed to be open-ended and to elicit detailed answers from students.

The first author transcribed the qualitative data immediately after the interviews were conducted. The data were initially transcribed in Chinese and then translated into English. Thematic analysis was used to analyse the interview transcripts (Barnes et al., 2000). The six steps detailed by Nowell et al. (2017) were followed to carry out the thematic analysis and to ensure its trustworthiness: data familiarisation, development of initial codes, theme identification, theme review, theme definition and naming, and theme reporting (p. 4).

The quantitative data from the online survey were first examined using frequencies so that an overview of the responses could be gained. Chi-square tests were then used to examine the relationships between variables and their statistical significance (p value): when p < 0.05, the relationship between the variables is reported as significant; when p > 0.05, it is reported as not significant. Because the research follows a mixed methods approach, the quantitative data are presented first, followed by the data from the qualitative interviews, which are used to explain and provide insight into the former (Creswell, 2017).
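For readers unfamiliar with this procedure, the short sketch below illustrates the kind of Chi-square test of independence described above. The paper does not state which statistical package was used, so the use of Python and scipy here is an assumption, and the contingency table contains invented counts for illustration only (not the study’s data); the decision rule simply mirrors the p < 0.05 threshold reported in the paper.

# Illustrative only: hypothetical counts, not the study's survey data.
from scipy.stats import chi2_contingency

# Rows: reported use of IELTS preparation materials (always / sometimes / never)
# Columns: highest IELTS band achieved (<5.5 / 5.5-6.0 / >6.0)
observed = [
    [12, 30, 18],   # always use
    [10, 16,  9],   # sometimes use
    [ 4,  1,  1],   # never use
]

# chi2_contingency returns the test statistic, p value, degrees of freedom
# and the expected counts under independence.
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

# Mirror the paper's decision rule: p < 0.05 -> report a significant association.
if p < 0.05:
    print("Significant association between the variables (p < 0.05).")
else:
    print("No significant association between the variables (p > 0.05).")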

Results

Test conditions and support materials

Test conditions and support materials are two key elements in Baird et al.’s (2014) framework. In reporting the data, this sub-section is divided into two aspects: the materials used in test preparation and students’ other test preparation activities, including IELTS training courses and mock exams.

Materials used in test preparation

The data from both the group interviews and the online survey indicated extensive use of past papers among students in their IELTS preparation.

The survey data (Table 4) show that 84% of students use IELTS preparation materials in their preparation (combining ‘always use’ and ‘sometimes use’). Table 5 reveals that 85% stated that the IELTS preparation materials were useful. This suggests that student participants think practising with test preparation materials can be beneficial: such materials help students identify what is expected from their answers and familiarise them with the formats and structures of questions. In the group interviews, students stated that Cambridge English IELTS Academic: Authentic Examination Papers (a collection of past papers) was the IELTS training textbook they used most. Students found this textbook especially useful because it is an official IELTS publication, from Cambridge English Language Assessment in collaboration with Cambridge University Press:

If you use other books, they won’t have the same good effect as the Cambridge English IELTS Academic (authentic examination papers) (G7S1).

Table 4 Would you use test preparation material to prepare for IELTS?
Table 5 Do you think that using IELTS preparation materials is useful to you in IELTS?

The Cambridge English IELTS Academic textbook was also used in tandem with online materials including Jijing—the latter being IELTS test items that have been memorised by Chinese students and shared online. It is, in effect, an online question bank of very recent past papers. Some students have reported that they have accessed Jijing through online discussion forums:

People will also share their tests’ questions and their answers through the online forums, as well as the information regarding the speaking test examiners in their area, so, you can see which topics are the most frequently tested ones and know the character of the test examiners in your area beforehand (G1S5).

However, some participants in this study suggested that Jijing is not very useful. One student, G7S3, stated that although he studied all of the IELTS questions in the online question bank, he never encountered those questions in the real test—although he conceded that he may have been ‘unlucky’. Other students purchased predicted test answers directly from online sources, while remaining aware that such answers needed to be used with caution, as they may result in low marks from examiners:

You can buy the predicted answers for the next month’s test just for 25 yuan (£2.50) online, but I don’t really trust the answers they give me, and in addition to that, if you memorise a speaking topic and try to recite it in front of the examiner, they would know and still give you a low score (G8S2).

In order to establish whether a relationship exists between two variables, students’ highest IELTS scores (Table 6) and the use of IELTS preparation materials, a Chi-square test was conducted on the survey data.

Table 6 The highest IELTS score students achieved

A significant association was revealed between the use of such preparation materials and the highest scores achieved in IELTS: χ² = 47.08, p < 0.05. From the data above, it is clear that over 80% of the students surveyed are using authentic test preparation materials (whether online courses, textbooks or crowdsourced content such as Jijing) and that they report these as useful.

Other test preparation activities

Students indicated that they can participate in IELTS training classes in one of two ways: (1) by attending the classes provided by the joint college, or (2) through private training agencies. They stated that these preparation classes are used to improve their familiarity with the test (to achieve test-wiseness) and to increase their test scores.

The joint college provides an IELTS training course for all first-year students. Table 7 shows that, combining ‘totally agree’ and ‘agree’, 58% of students reported that the IELTS training course provided by the joint college can help them improve their IELTS score. Student G1S1 stated that ‘after I entered the college, on my first year at the IELTS preparation English class, our reading teacher taught me a lot of techniques, which helped me answer many unsure test items of IELTS’. Students reported that the course is especially helpful for first-year students who are new to IELTS: ‘I learned the IELTS techniques from the first year IELTS English preparation class. I found what teachers taught was quite useful, especially for us who know nothing about IELTS. I think the test-taking techniques can improve our IELTS scores’ (G1S4).

Table 7 The IELTS training courses provided by the joint college can help to improve the IELTS score

Although over half of the survey participants suggested that the joint college’s IELTS training course can help them improve their IELTS scores, Table 8 reveals that over half of the students also stated a preference for the services of private training agencies, suggesting that they believe this route to be a more efficient way of improving their scores.

I took IELTS training at private agencies. I think those teachers can teach you the test-taking skills and I think they are helpful, if you want to finish the IELTS test items in a limited time frame and keep a certain level of accuracy. (G1S6).

Table 8 Will you participate in private IELTS training agencies?

Some students suggested that the private training course is better than the course provided by the college: ‘The teachers from the training agency taught better than the teachers from our college; they are professionals in this area. Our teachers’ level is lower compared to them’ (G7S7). One of the main reasons, from students’ experience, is that the IELTS test item bank changes every 4 months, and teachers from private training agencies gather these items and use them within preparation classes. This narrowed focus on actual test items allows students to encounter the same test items that might appear in their own version of the test.

My teachers from the private IELTS training agency told us that in August the writing topics will be education and government, [so] we only need to focus on these two. In addition, I was astonished about the teacher who taught us IELTS listening in that private training agency—once, the teacher made a correct bet on all four sections of the listening questions. I don’t know how the teacher did that! (G5S7)

In order to attract students, some private training schools advertise that they can make accurate predictions of exam topics. This is sometimes achieved through means that can be described as shady, for example asking test-takers who have just finished IELTS, outside the test site, for information about the topics included in the test papers.

Some private agencies—maybe due to the purpose of attracting students—will send someone to wait outside each examination site, asking information from students who just finished the test and telling other students which topics have been tested, so you don’t need to prepare such topics anymore today. (G5S7).

In addition to private training schools, some students reported taking online training courses in order to acquire test-taking skills for each part of IELTS. These students often find the online IELTS preparation courses conducive to test preparation because the materials, including recorded lessons, are stored online, which enables them to review them anytime and anywhere they want.

I acquired the test-taking skills by taking the online courses. After class, we will head back to the dorm and watch them. I learned most of the test-taking skills from the online course (G1S2).

In order to test whether any relationship exists between the highest IELTS scores achieved by students and whether students had mastered the test-taking skills gleaned from the training courses, we used the Chi-square test. The two variables have a significant relationship: χ² = 10.35, p < 0.05. This suggests a positive relationship between participants’ (self-reported) IELTS scores and the effectiveness of the training courses described above.

Despite 57.4% of students suggesting that taking the training course is helpful, and 51.5% of students having joined a private training course, not all of the students considered the training, whether inside or outside the college, to be helpful. These students hold a reserved view towards the test-taking tips and strategies prepared by their teachers, questioning their accuracy because the teachers are not IELTS examiners and thus lack insider knowledge of IELTS.

All the test-taking techniques taught by teachers were summarised by themselves, which means they may not work for other people … so you need to summarise a set of test-taking skills by yourself. Anyway, I used the test-taking skills from teachers and got a very low score, then I stopped using the test-taking strategies from teachers (G6S2).

Additional test preparation activities of students: mock exams

In the context of this research, mock exams and authentic exams were also used by students to improve their test familiarity, to practise time management, and to reduce the pressure and anxiety they felt. Two forms of mock exam were reported in this study: one in the sense used by Baird et al. (2014), in which the mock exam provided by the joint college is used by students as preparation for IELTS; in the other, students take the authentic IELTS on several occasions, effectively using the authentic exam as a series of ‘mocks’.

However, some students claimed that the mock exam offered by the joint college was not helpful, because it is not as authentic as the real IELTS in many respects, including question types, topics and the pressure faced by students.

It doesn’t work. I think we need to take the real IELTS as a practice because you will feel different pressure when you are facing real examiners and questions. Because the mock examination provided by the college is formed by the past test questions, those students who have practised many past questions will gain high scores easily. In addition to that, unlike in the real IELTS, there is a pressure of spending 2,000 yuan (£200; see Note 1) in fees, but the mock exam is free of charge. So, there is a big gap between the mock examination and the real IELTS (G2S6).

It was claimed by students that taking the authentic IELTS helps them to improve their test performance. This belief leads to multiple attempts at the test; indeed, 33% of participants stated that they had taken it four times or more (see Table 9). In the interviews, students G1S6 and G1S2 stated that they took IELTS on four occasions, while G7S2 and G8S3 revealed that they had taken it on seven and eight occasions, respectively.

Table 9 Number of times students take IELTS

Students reported that the money spent on gaining an authentic test experience was worthwhile because it helps them become familiar with the administrative procedure of the test. This is useful because, on the day of the actual examination, the pressure associated with the test procedure is alleviated, allowing students to focus more effectively on responding to test items.

I believe that instead of using the time you spend at home to practise the past papers, you can benefit more from attending the real IELTS. By attending the real IELTS, you can get familiar with the whole test procedure. In this way, your score of the test will improve naturally (G8S1).

Test and performance format

In the framework proposed by Baird et al. (2014), test and performance format are predictable when the language skills in focus, the format of test items and the specific topic areas are known in advance. This section looks at the predictability of the test and performance format, and the impact this can have on students’ test preparation.

It was stated by students that parts of IELTS can be predicted. The speaking section was regarded as the most predictable section, partly because its topics change only once every 4 months, as G1S4 noted, but also because the number of topics for which students can prepare is limited:

I focus 100% on the speaking part, because for the speaking part, there are only 40 topics so you can remember them all (G2S7).

IELTS only changes its test questions three times a year, so if you have the old speaking topics and attend the test before it changes to new questions, the topics can at least be limited to 30 topics (G8S4).

In addition to that, students can also practise their test performance before the speaking test:

The speaking score can be easily and quickly improved. The person only needs to rote-learn the predicted speaking materials well enough and perform it in front of the examiners for around ten minutes (G2S5).

It is also the case that the high predictability of the test and performance format can have an impact on the mode of study used by students. Table 10 shows that independent study is used by almost two-thirds of students.

Table 10 Interaction with teachers and classmates

The reason for this is that students do not think it is helpful to have interaction with teachers and fellow students to improve their testing skills. To them, the key to success in IELTS speaking is to have ample opportunities to practise the target language. Therefore, it makes sense to the students to practise speaking frequently on their own.

I mainly interact with myself to study IELTS because I think it is useless to find a partner to practice. When speaking with a partner, we won’t really listen to each other; it works better if I practice by myself and speak more. The teacher may give you some testing skills, but it won’t work if you don’t practice by yourself (G6S2).

Some students concurred with the idea that practising with one’s classmates is useless, since their English level is the same. These students do not believe in the value of learning with peers, for example through peer assessment, because their peers face the same difficulties as they do, and they doubt that they would learn anything new from them.

I don’t like to practise English speaking with my classmates because we are at a similar level and we all have an accent; practising with each other may only make it worse. (G8S4).

However, not all the students reported that studying with other people is useless. Student G3S4 preferred to interact with teachers because teachers had more experience and can tell students how to study. Student G3S1 agreed with this: ‘I think teachers have more experience, so they can point out which parts we would make mistakes in, and they can help us gain some confidence before the test.’ Other students expressed the view that interaction with classmates is helpful, especially on IELTS speaking: ‘I will practise oral English with my classmates for the IELTS speaking part’ (G1S5). Student G6S5 formed a partnership with another student: ‘We arranged to take part in IELTS at the same time, so we can practise IELTS English speaking with each other every night before the test.’

In order to test whether there was a relationship between the highest IELTS score students achieved and students’ interaction behaviours, we used the Chi-square test. No significant relationship was found between these variables: χ² = 11.209, p > 0.05. This suggests that students’ preferences for interaction during their period of study were not associated with the highest IELTS score they achieved.

Scope of language skills

The approach of ‘teaching to the test’ adopted by tutors both inside and outside the joint college had the effect of narrowing students’ focus in their study of English. Focusing largely or exclusively on IELTS means that students concentrate on particular English skills with a view to succeeding in the test; in turn, this approach limits their language learning experience. Table 11 shows that 36% of students (combining ‘agree’ and ‘totally agree’) report that IELTS has limited their English learning scope.

Table 11 IELTS test has limited my English learning scope

Students narrowed the focus of their English study to certain skills as a strategy to gain a high overall score within a certain time period. During the group interviews, most of the students said that they focused their study on IELTS speaking, as they reported that speaking scores can be improved more easily than those of other sections:

I mainly focus on the speaking part. Even though my listening and speaking part are both not very good, the former is hard to improve. But the speaking part won’t change too much until every three months (G1S4).

Some students reported that the pressure can be reduced by fully preparing for the speaking section: ‘I focus on the speaking part, because I was under a lot of pressure when taking the speaking part, so by preparing for it, it can help me reduce the pressure’ (G6S3). Another reason for some students to focus on the speaking part of the test is that they tend to achieve a relatively lower IELTS score in this section than in others. Students G3S1 and G3S3 underscored this by saying that they focus on the speaking section because it is their area of weakness. Student G1S6 expressed the same opinion:

My focus is also on the speaking part because I am not good at English speaking. Every time I take IELTS, I will look at previous topics to make sure I can get a good mark (G6S2).

However, not all the students focused on the speaking section of the test. Some reported focusing on other sections, such as writing, listening and reading. This decision was based on two considerations. First, students chose which IELTS papers to focus on based on a self-assessment of their strengths and weaknesses, tending to devote most of their test-preparation time to the paper in which they were weakest. Alternatively, some more strategic and pragmatic students focused on the papers in which they believed they had the best chance of performing well.

I mainly focus on writing because my score on the writing part is not stable (G1S7).

I mainly focus on the listening and reading parts because taking IELTS in mainland China, they won’t give you a high score on the writing and speaking parts, so I put all my effort into the listening and reading part instead, to get my overall score higher (G5S7).

Discussion

This study has examined students’ perspectives on IELTS predictability, using the test predictability framework proposed by Baird et al. (2014) (Table 1) as the conceptual framework to guide our analysis. Baird et al.’s (2014) framework covers five areas of test predictability, which we apply to our analysis of Chinese EFL learners’ IELTS preparation: test format, performance format, examination support materials, test conditions and scope of language skills. Regarding test format and test conditions, the study found that students increased their familiarity with the format and administration procedure of IELTS by taking the mock examination provided by their college and by taking the authentic IELTS multiple times. To the students, taking the authentic IELTS is especially effective for gaining first-hand experience of how the test is administered. Experience of the way IELTS is administered matters to some student participants because it allows them to concentrate on responding to test items instead of dealing with unexpected administrative procedures. This finding supports the argument made by Winke and Lim (2014) regarding familiarisation with test procedure. As for test format, students find practising past papers and attending preparation courses run by private tutorial schools helpful. In particular, students claimed that they can obtain accurate information about test items, such as the topics to be tested, from their teachers in private tutorial schools, because these tutors gather the latest information about IELTS from feedback provided by learners who have just taken the test.

As for performance format, the study found that most participants join test preparation courses in which scripted answers are provided for them to memorise. In particular, private training agencies play an essential role in students’ test preparation activities. In this research, students signed up to IELTS preparation classes in order to access test-taking skills or predicted answers, seeing these classes as a shortcut to enhancing their IELTS scores within a brief period of time. The work of Elwood et al. (2017) supports this finding: their research participants also obtained support from private agencies and reported that these agencies gave them extra help to gain additional marks.

As far as support materials are concerned, students in this study mainly rely on two types of test preparation material: an official textbook and online discussion forums. The student participants trust the textbook used in their preparation course at the college because it is developed by the official IELTS test providers. Moreover, students obtain additional IELTS materials through an online platform where test-takers share test items they have memorised from their recent test-taking experience. Additionally, some learners find the materials of online IELTS preparation courses useful because they can review them at any time. In sum, IELTS test-takers in this study seem to draw on both formal and non-formal learning materials to prepare for the test, extending the notion of language learning beyond the classroom (Reinders & Benson, 2017) to the context of language testing and test preparation.

In terms of scope of language skills, students focused on certain skills in their English study as a strategy to achieve a high overall score within a particular time period. In the group interviews, a majority of the students stated that their study was focused on the speaking section of IELTS, believing that its scores can be improved more easily than those of other sections. Students also selected their focus of study based on two criteria: their own reflections on their weaknesses and the practical consideration of which section they were most likely to be able to improve in within a short period of time.

Implications and conclusion

As this study suggests, students seem to be influenced significantly by the belief that attending preparation classes within a compressed time period and accessing their materials will yield higher scores, even though such activities may also hinder the full development of their English language skills. Additionally, some private training agencies and individuals who sell predicted answers are involved in allegedly unethical activities. Such test techniques can assist students in attaining a score greater than their language ability may warrant (Hu & Trenkic, 2021), but the problem remains that students using these techniques still lack the ability to do well in the future (Inoue et al., 2021). Unlike in other studies, in the present study’s research context students acquired predicted answers from private training schools, from online resources, or both. The result is that IELTS is made more predictable, which also has the negative effect of reducing the depth of students’ English learning, as their high scores are achieved through rote learning and prepared answers. This may be among the reasons why IELTS is less able to accurately predict students’ language competency in their subsequent academic careers. Consequently, although institutions prefer using exam scores as a short-cut mechanism for the admission of students (Pearson, 2019), our data suggest that they should be wary when making decisions based on IELTS scores alone.

Given the critical role of tests in a test-driven educational system such as China’s, it is easy to imagine how greatly students’ lives may be influenced by exams. This study’s findings suggest that clearer goals in language learning could direct students’ test-related activity towards outcomes that more closely reflect learners’ requirements and motives.

From our findings, we would argue that joint colleges should be made aware of this situation and, as discussed above, should explain to students that the most useful way of improving test scores is through academic English combined with test preparation, and should provide such courses; this approach is supported by the study conducted by Johnson and Tweedie (2021). How to counteract the activities of private training schools deserves exploration in future research. The significant impact of IELTS on students within our research context cannot be ignored. While the aim of requiring IELTS is to generate positive learning outcomes and prepare students for the EMI modules, the reality is that students take passing the test as the main goal of their education. This paper shows that neglecting students’ reactions to, and views on, this reality means we miss valuable insight into how issues in testing can influence students. As Elwood and Murphy (2015) advise, in a high-stakes testing context, the impact of the test on students should be considered the primary priority. Thus, it would be beneficial to collect students’ voices and re-evaluate our educational decisions, including the use of international English tests like IELTS as the sole indicator of college students’ English proficiency level.

Limitations and suggestions for future research

A mixed methods approach was used in the study to gain a comprehensive picture by triangulating the results of different approaches. Nevertheless, as with all research, limitations in terms of scope, transferability and generalisation remain. The main limitation, perhaps, is scope: because of time constraints, the study was carried out with only a limited number of participants, and data were gathered from only one joint college in China. On the other hand, the overall response rate was reasonably good for a study of this nature. Future studies would benefit from collecting data from a range of joint colleges in order to include a larger population and to unravel the deeper, more complex perceptions of IELTS test-takers with different backgrounds and English proficiency levels. Given that the research site in this study is a medical university, different results might be found at general universities. Consequently, the results of this study may not be wholly representative of the wider test-taking population, and one should remain wary about generalising its findings to other educational contexts. To better explore the washback effects of IELTS in this context, future researchers may conduct a longitudinal study to capture their complexity.

Availability of data and materials

Data will not be shared because of the confidentiality agreement with participants.

Notes

  1. At time of writing £1 = 9 yuan.

Abbreviations

IELTS: International English Language Testing System

References

  • Baird, J., Hopfenbeck, T. N., Elwood, J., Caro, D., & Ahmed, A. (2014). Predictability in the Irish Leaving Certificate. OUCEA Report, 14(1), 1–111.

  • Barnes, M., Clarke, D., & Stephens, M. (2000). Assessment: The engine of systemic.

  • Birmingham, P., & Wilkinson, D. (2003). Using research instruments: A guide for researchers (1st ed.). Routledge.

  • Cheng, L. (2005). Changing language teaching through language testing: A washback study (Vol. 21). Cambridge University Press.

  • Chong, S. W., & Ye, X. (2021). Developing writing skills for IELTS: A research-based approach. New York, NY: Routledge.

  • Clark, T., Spiby, R., & Tasviri, R. (2021). Crisis, collaboration, recovery: IELTS and COVID-19. Language Assessment Quarterly, 18(1), 17–25. https://doi.org/10.1080/15434303.2020.1866575

  • Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks: Sage.

  • Dang, C. N., & Dang, T. N. Y. (2021). The predictive validity of the IELTS test and contribution of IELTS preparation courses to international students’ subsequent academic study: Insights from Vietnamese international students in the UK. RELC Journal. https://doi.org/10.1177/0033688220985533

  • Dong, L. (2019). A study of IELTS’s affective washback on Chinese students’ learning goal, motivation, and anxiety. Language Teaching Research Quarterly, 9, 122.

  • Elwood, J., Hopfenbeck, T., & Baird, J.-A. (2017). Predictability in high-stakes examinations: Students’ perspectives on a perennial assessment dilemma. Research Papers in Education, 32(1), 1–17. https://doi.org/10.1080/02671522.2015.1086015

  • Elwood, J., & Murphy, P. (2015). Assessment systems as cultural scripts: A sociocultural theoretical lens on assessment practice and products. Assessment in Education, 22(2), 182–192. https://doi.org/10.1080/0969594X.2015.1021568

  • Estaji, M., & Ghiasvand, F. (2021). IELTS washback effect and instructional planning: The role of IELTS-related experiences of Iranian EFL teachers. Journal of English Language Teaching and Learning, 13(27), 163–192.

  • Gan, Z. (2009). ‘Asian learners’ re-examined: An empirical study of language learning attitudes, strategies and motivation among mainland Chinese and Hong Kong students. Journal of Multilingual and Multicultural Development, 30(1), 41–58. https://doi.org/10.1080/01434630802307890

  • Hu, R., & Trenkic, D. (2021). The effects of coaching and repeated test-taking on Chinese candidates’ IELTS scores, their English proficiency, and subsequent academic achievement. International Journal of Bilingual Education and Bilingualism, 24(10), 1486–1501. https://doi.org/10.1080/13670050.2019.1691498

  • Inoue, C., Khabbazbashi, N., Lam, D. M., & Nakatsuhara, F. (2021). Towards new avenues for the IELTS Speaking Test: Insights from examiners’ voices. IELTS Partners.

  • Johnson, R. C., & Tweedie, M. G. (2021). “IELTS-out/TOEFL-out”: Is the end of general English for academic purposes near? Tertiary student achievement across standardized tests and general EAP. Interchange, 52(1), 101–113. https://doi.org/10.1007/s10780-021-09416-6

  • Noori, M., & Mirhosseini, S. A. (2021). Testing language, but what? Examining the carrier content of IELTS preparation materials from a critical perspective. Language Assessment Quarterly, 18(4), 1–16. https://doi.org/10.1080/15434303.2021.1883618

  • Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic analysis: Striving to meet the trustworthiness criteria. International Journal of Qualitative Methods, 16(1), 160940691773384. https://doi.org/10.1177/1609406917733847

  • Pearson, W. S. (2019). Critical perspectives on the IELTS test. ELT Journal, 73(2), 197–206. https://doi.org/10.1093/elt/ccz006

  • Pearson, W. S. (2020). Mapping English language proficiency cut-off scores and pre-sessional EAP programmes in UK higher education. Journal of English for Academic Purposes, 45, 100866. https://doi.org/10.1016/j.jeap.2020.100866

  • Pearson, W. S. (2021). The predictive validity of the Academic IELTS test: A methodological synthesis. ITL-International Journal of Applied Linguistics, 172(1), 85–120. https://doi.org/10.1075/itl.19021.pea

  • Poza, L. E., & Shannon, S. M. (2020). Where language is beside the point: English language testing for Mexicano students in the Southwestern United States. The Sociopolitics of English Language Testing, 46. https://doi.org/10.5040/9781350071377.0008

  • Reinders, H., & Benson, P. (2017). Research agenda: Language learning beyond the classroom. Language Teaching, 50(4), 561–578. https://doi.org/10.1017/S0261444817000192

  • Sari, N. A., & Mualimin, M. (2021). The influence of the pandemic on the motivation of EAP learners in studying IELTS. In E3S Web of Conferences (Vol. 317, p. 02031). EDP Sciences.

  • Stenlund, T., Eklöf, H., & Lyrén, P. E. (2017). Group differences in test-taking behaviour: An example from a high-stakes testing program. Assessment in Education: Principles, Policy & Practice, 24(1), 4–20. https://doi.org/10.1080/0969594X.2016.1142935

  • Tsagari, D. (2013). EFL students’ perceptions of assessment in higher education.

  • Tsang, C. L., & Isaacs, T. (2021). Hong Kong secondary students’ perspectives on selecting test difficulty level and learner washback: Effects of a graded approach to assessment. Language Testing, 1–27. https://doi.org/10.1177/02655322211050600

  • Winke, P., & Lim, H. (2014). Effects of testwiseness and test-taking anxiety on L2 listening test performance: A visual (eye-tracking) and attentional investigation. IELTS Research Reports Online Series, 30.

  • Xie, Q., & Andrews, S. (2013). Do test design and uses influence test preparation? Testing a model of washback with structural equation modeling. Language Testing, 30(1), 49–70.


Acknowledgements

The authors wish to thank Professor Jannette Elwood for the supervision of the first author’s doctoral study. This paper is based on the first author’s doctoral dissertation.

Funding

There is no funding for this research.

Authors’ information

Hui Ma is a lecturer at the School of Foreign Languages, Southeast University, China. He received his doctorate from Queen’s University Belfast, and his research interests are language assessment and language teaching and learning.

Sin Wang Chong is a Lecturer (Assistant Professor) in TESOL at the School of Social Sciences, Education and Social Work at Queen’s University Belfast, an Affiliated Lecturer (Assistant Professor) in TESOL and Academic English at the University of St Andrews, and a Senior Fellow of the Higher Education Academy. He is Associate Editor of two international journals: Innovation in Language Learning and Teaching, and Higher Education Research & Development.

Author information

Authors and Affiliations

Authors

Contributions

HM conceptualised the study, collected and analysed the data, and contributed to writing up of the manuscript. SWC reviewed the analysed data and contributed to writing up of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sin Wang Chong.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Ma, H., Chong, S.W. Predictability of IELTS in a high-stakes context: a mixed methods study of Chinese students’ perspectives on test preparation. Lang Test Asia 12, 2 (2022). https://doi.org/10.1186/s40468-021-00152-3
