Case study | Open Access
IELTS economic washback: a case study on English major students at King Faisal University in Al-Hasa, Saudi Arabia
Language Testing in Asia, volume 8, Article number: 5 (2018)
The International English Language Testing System (IELTS) is a standardized, high-stakes test used to measure the language proficiency of candidates who wish to study or work in an environment where English is the language of communication (IELTS introduction. Retrieved from https://www.ielts.org/what-is-ielts/ielts-introduction. Last accessed 2 Nov 2017.). This study aims to explore the economic washback of the test, an under-researched area in the literature.
A qualitative approach was adopted: data were collected from 94 participants through semi-structured questionnaires, and semi-structured interviews were conducted with six of them. Data were analyzed inductively using the software package NVivo.
Discussion and Evaluation
The findings show that the participants were financially affected by the test. In the questionnaires, the participants reported that the test was expensive and that its cost rose repeatedly. The students most affected financially were those who had to repeat the test because they could not meet the minimum score requirements. The interview results revealed that all the students in the focus group had repeated the test at least four times. In addition, most students explained that they depended on their families to pay for the cost of the test. Furthermore, some students had to travel to a different region or country because the academic module was not offered at their test center. Finally, the interview results indicated that preparation courses were an additional financial burden on these participants, especially because they were not sure when they would be ready to take the test or whether the training they had received would be enough to help them achieve the required score.
The IELTS test thus created a negative economic washback on these students. To lessen this financial effect, the participants recommended that the test price be reconsidered, that the test be made modular, and that the best result in each module from any two or more IELTS tests taken in the last two years be accepted.
Tests do not operate in a vacuum. They are highly “embedded in educational, social, political and economic contexts” (Shohamy, 2001, p. xvi). Undoubtedly, context plays an important role in understanding tests. Macqueen, Pill, and Knoch (2016) explained that “the role of test impact research is to challenge the taken-for-grantedness of the standard by interrogating its rationale and examining its effects” (p. 273). Hence, this paper focuses on the economic washback of the IELTS test. It aims to explore the negative economic consequences that examinees face. It should be noted that this paper does not discuss the economic gains or opportunities that examinees may secure, such as jobs, university admission, and ease of immigration; rather, it focuses on the financial burdens that the test imposes on examinees. O’Sullivan (2011) notes that “Now testing is an industry…about a $2 Billion a year industry”. More specifically, the number of candidates sitting the IELTS exam is rising rapidly every year because of the industrialization of the test, its recognition and trust around the world, and, most importantly, its high validity and reliability. O’Sullivan (2011) explained that “twenty years ago, the population of IELTS was less than 10,000 now it’s 1.5 million.” Not surprisingly, over 2.9 million people took the test in 2016 across more than 140 countries (IELTS, 2017b). The IELTS has thus gained precedence over other major tests in the industry, such as the TOEFL.
The examinees are considered a key contributor to the success of the test, especially its economic side. Shohamy (2001, p. 123) pointed out that “The high economic value placed in tests also explains why test takers are willing to spend big sums of money for test preparation courses and support a growing industry that just make the rich even richer.” This situation applies to the IELTS test: students who wish, for instance, to gain acceptance into prestigious universities (where IELTS is one of the admission criteria) will be keen to uncover the techniques required to succeed in the exam and to optimize their test potential in order to achieve the required score. Such preparation is said to significantly increase students’ chances of achieving the score they want. However, the quality and duration of preparation differ from one candidate to another, depending mainly on the score the candidate initially achieved on the test (or on another test designed for the same purpose) and on the target score. It can take several weeks, months, or, in the worst cases, a couple of years for examinees to succeed in the test. Consequently, the longer students prepare and the more times they fail the test, the higher the financial costs they incur. This preparation often leads to the desired outcome, but not always; candidates are thus affected by the test to differing degrees.
Green (2007, p. 17) stated that “Participants set the test stakes according to their awareness (or lack of awareness) of the uses to be made of test results.” IELTS is a high-stakes test, and most students are aware of the consequences of failing it; therefore, they study hard for it. It follows that the higher the stakes, the more effort is exerted and, often, the more money is spent on the test and on preparation courses. Ingulsrud (1994) described the financial impact on students and their families in the context of Japanese university entrance examinations as follows:
For students who are serious about entering a highly ranked university, a considerable amount of coaching is normal in preparing for the entrance examination...Supplemental education of this kind costs a good deal of money, and yet students and their families are willing to make such sacrifices. (p. 79–80)
The above discussion clearly shows that tests (especially high-stakes ones, such as the IELTS) are inevitably embedded in economic contexts. Hence, candidates need to comply with the economic demands (e.g., enrollment in costly preparation courses, unlimited and unwise repetition of the test, or buying specimen materials pertaining to the test) even if they know that they are being exploited. Indeed, Spolsky (1995, p. 1–2) found that in the early history of the TOEFL test, there was a “tendency for economic and commercial and political ends to play such crucial roles that the assertion of authority and power becomes ultimately more important than issues of testing theory or technology.”
The term “washback,” in Applied Linguistics (see, e.g., Alderson and Wall, 1993; Bachman, 1990; Cheng, Watanabe, and Curtis, 2004; Davies, Brown, Edler, Hill, Lumley, and McNamara, 1999; Green, 2007, among others), and the term “backwash,” in education (see, e.g., Broadfoot, 1996; Gipps, 1994; Hughes, 2003, among others), are used interchangeably to mean the same thing: the influence of tests on instruction (Davies et al., 1999), teaching (Alderson and Wall, 1993), and learning (Hughes, 2003). For the purposes of this study, the term “washback” will be used. Though washback can be both positive and negative, this paper focuses on the negative washback (in financial terms) of the IELTS test.
Previous studies pertaining to washback on the IELTS test focused on the impact of the test on professional associations (Merrifield, 2008, 2011, 2016), on education and society (Moore, Stroupe, and Mahony, 2012), on IELTS preparation courses (see, e.g., Allen, 2017; Brown, 1998; Chappell, Bodis, and Jackson, 2015; Green, 2007; McPherson, Chand, and Khan, 2003; Mickan and Motteram, 2008; Mickan and Motteram, 2009; Rao, Read, and Hayes, 2003), on score gain (Allen, 2017; Craven, 2012; Elder and O’Loughlin, 2003; Green, 2007; Humphreys, Haugh, Fenton-Smith, Lobo, Michael, and Walkinshaw, 2012; O’Loughlin and Arkoudis, 2009), on teaching materials (Saville and Hawkey, 2004), on computer-based versus traditional hand-written forms of IELTS (Weir, O’Sullivan, Yan, and Bax, 2007), on the affective and academic impact (Rea-Dickins, Kiely, and Yu, 2007), on candidates’ language (O’Sullivan and Lu, 2006), on test-taking purposes, i.e., immigration and secondary education (Merrylees, 2003), and on gender (O’Loughlin, 2000). However, none of these IELTS impact studies has dealt exclusively with the economic washback of the test on the candidates; the financial burden of the test has received only brief discussion (see, e.g., Chappell et al., 2015; Merrifield, 2008; Moore et al., 2012). This study focuses mainly on the economic washback of the IELTS test, thereby filling this gap in the literature. Based on the literature discussed above, one main research question and two sub-questions were addressed in this paper, as follows:
The main question is:
How do Saudi students at King Faisal University view the economic washback of the IELTS test?
The sub-questions are:
What are the consequences of the financial burdens that the IELTS test has imposed on these candidates? Why and in what way?
How will they overcome or avoid such financial difficulties? And what role does their language proficiency play in that?
One hundred and twenty students majoring in English (60 males and 60 females) from the Department of English Language at King Faisal University in Saudi Arabia were recruited for the study. Given the nature of the study, participants were selected purposively so as to allow the researcher to select those who were affected negatively by the test. Lincoln and Guba (1985, p. 40) emphasized that purposive sampling “increases the scope or range of data exposed as well as the likelihood that the full array of multiple realities will be uncovered.”
All the students who participated had taken the IELTS test at least once in the preceding 2 years or so. The consent of the participants was obtained prior to the study. Semi-structured questionnaires as well as semi-structured interviews were used for data collection. The researcher obtained initial approval from 120 students to take part in the study. Of these, 94 participants (53 females and 41 males) completed the questionnaires (a completion rate of 78%); 22 of them were in their third year and 72 in their fourth year of academic study. After that, the researcher conducted semi-structured interviews with six students (three females and three males) who had sat the IELTS test several times and had enrolled in preparation courses; the intention was to choose those who were most affected by the test.
Materials and procedure employed
The study was conducted at the Department of English Language at King Faisal University in Saudi Arabia at times convenient to the students. Data collection lasted 2 months. The semi-structured questionnaire was constructed to explore the students’ views regarding the negative washback of the test. Similarly, the interviews were conducted to obtain a detailed description of the students’ perspectives on the test (those pertaining to negative washback) and to verify the views stated in the questionnaires. The questions asked in the questionnaire and the interviews were carefully formulated so as to include only items closely related to the central aim of the study; thus, the researcher was keen to include topics related only to the economic washback of IELTS. This was ensured through a lengthy process of expert review and trials with a small sample of students. The questionnaire was completed online, while the interviews were conducted face-to-face. The researcher interviewed the male students, while the female students were interviewed by a female research assistant, who was trained beforehand and given the agenda prior to the interviews.
In this qualitative study, the data were analyzed inductively following the method of content analysis, with themes as the unit of analysis; that is, the main ideas were categorized into themes for ease of understanding. The interviews were first transcribed by the researcher, and the transcripts were returned to the interviewees for validation. The themes, which originated mainly from the questionnaires and the interviews, were then coded and analyzed in the software package NVivo.
Discussion and Evaluation
Time in which the test had been taken
All the students reported that they had taken the test within the preceding 2 years, so their results were still within the “safe period” during which, according to the test provider, language proficiency is not expected to suffer from attrition (Penn State University, 2017). Accordingly, when the test was repeated within this period, their scores were unlikely to show a significant improvement attributable to changed proficiency.
Reason for choosing IELTS
Most of the students took the IELTS test in order to study abroad or to gain admission to a master’s program in Saudi Arabia. There was thus a genuine motivation behind taking the test beyond simply determining their level of proficiency, which only very few students indicated. The current situation in Saudi Arabia for those who wish to study abroad through the King Abdullah Scholarship Program requires them to provide evidence of their language proficiency to show that they can meet the language requirements of the intended program of study. In the past, students were sent to language centers (with the language component being the condition for entry into a program), and, in most cases, they needed to complete a year-long preparation program in English for academic purposes. The new measures were adopted because of the prevailing global financial and economic crisis and, most importantly, because some students spent more than a year in language centers. Requiring students to provide evidence of language proficiency and to obtain unconditional offers has therefore proved cost-effective and time-efficient for the government. That is the key role that proficiency tests have played in Saudi Arabia.
The students’ scores
The students’ overall IELTS scores ranged from 5.5 to 7.5. The overall scores required by the postgraduate programs that most of the students wanted to join ranged from 6 to 7. The real challenge for the students was not achieving the required overall score but meeting the minimum band score in each module (which ranged from 5 to 6, depending on the program of study).
The number of times the test had been taken
The majority of the students sat the test more than once; some took it three to five times. Most reported that the key reason they kept repeating the test was that they could not achieve the required balance between the overall score and the band scores. One of the students commented as follows:
It’s like hit and miss, if you get the overall score you are happy and feel satisfied but how about if you cannot get the band score, you will repeat the test and sometimes you fail in another module or you get the band score but you fail in the overall one. So, it’s a matter of luck.
It is for this reason that some of the students suggested that the IELTS test should be made modular, so that students would retake only the module in which they had failed. This point is discussed further in the interview results.
The cost of the test
At the time the data for this study were collected, in early 2017, the cost of the IELTS exam was 1000 Saudi Riyals (SAR) (approximately USD 266). As of 1 August 2017, the price had risen to SAR 1050 (approximately USD 280) (British Council, 2017a). Most of the students indicated that the cost of the test, which kept rising, had affected them financially. Some students said that they were keen to take their next test as soon as possible in order to avoid any price increase; others booked two or more tests at a time in order to take advantage of the old price. To the best of the researcher’s knowledge, no publicly available record tracks the increase in the price over time. However, O’Sullivan (2011) argued that the price of the test is somewhat unreasonable:
The pricing it said, it should cost 10 pounds plus local costs. Now if you take our currency converters that you get them in banks, you can take 10 pounds in 1980 money and turn it into whatever it should be in this nowadays money, it’s 32 so 32 plus local costs. Maybe another 10 quid, so 42. So that’s where we think or I think the price should be around somewhere. Instead it’s 142, you take 142 that’s an exaggeration it varies from about 110 pounds to 140 pounds in various markets.
In addition, some students said that other services had also been affected by these price changes, including changing the test date, sending the results by post (locally and internationally), and obtaining an additional copy of the test report to be sent internationally. Not surprisingly, some students stated that they had compared test prices and found them cheaper in, for example, Bahrain, which is very close to the Eastern Province of Saudi Arabia. Currently, the price of the test there is BHD 95 (approximately USD 250), compared with SAR 1050 (approximately USD 280) in Saudi Arabia (British Council, 2017a). These students reported having saved some money through this cheaper option, especially when booking two or more tests. What these students report, though it may not generalize to other contexts, may contradict the British Council website’s claim that “IELTS has a set fee for its test” (British Council, 2017b).
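The fee figures quoted above can be cross-checked with a short sketch. This is purely illustrative: the peg rates used below (SAR 3.75 and BHD 0.376 per USD) are assumptions, not figures from the study, so the USD values differ slightly from the article’s rounded numbers.

```python
# Illustrative check of the fee figures quoted in the text.
# Assumed peg rates (not from the article): SAR 3.75/USD, BHD 0.376/USD.
SAR_PER_USD = 3.75
BHD_PER_USD = 0.376

old_fee_sar = 1000      # Saudi fee before 1 August 2017
new_fee_sar = 1050      # Saudi fee from 1 August 2017
bahrain_fee_bhd = 95    # fee at the Bahrain test center

print(f"Old Saudi fee: ~USD {old_fee_sar / SAR_PER_USD:.0f}")
print(f"New Saudi fee: ~USD {new_fee_sar / SAR_PER_USD:.0f}")
print(f"Bahrain fee:   ~USD {bahrain_fee_bhd / BHD_PER_USD:.0f}")
print(f"Fee increase:  {100 * (new_fee_sar - old_fee_sar) / old_fee_sar:.1f}%")
```

Under these assumed rates, the August 2017 change amounts to a 5% rise, and the Bahrain fee works out roughly USD 25–30 cheaper per sitting, consistent with the savings the students describe.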
The cost of IELTS preparation courses
Similarly, the IELTS preparation courses and their materials were considered an additional financial burden on the students. Almost all the students said that they had to enroll in a preparation course in order to familiarize themselves with the test and with the techniques for succeeding in it. Some said that they benefited from the free online preparation courses that the British Council offered on its website, yet they emphasized that the information in those courses was general, superficial, and less comprehensive than that in the costly courses available on the market. According to the students, the price of an 8-week preparation course ranged from USD 300 to USD 400. Sometimes students needed to stay in a course longer because they had not attained the target score, and thus had to pay more. In contrast, one student said that she had wasted her time in a preparation course and decided instead to prepare for the test by herself using IELTS preparation books and materials.
Right after the students had completed the questionnaires, six students (three females and three males) were selected from the whole group of 94 (53 females and 41 males) to take part in semi-structured interviews. These students were selected because they had sat the IELTS test numerous times and had enrolled in preparation courses; the intention was to choose those who were most affected by the test. All the students who were interviewed said that they had taken the IELTS test solely because they wished to join a university course abroad. The interviews investigated in greater depth the financial burdens that the test imposed on these students.
The cost of the test and the students’ scores
As mentioned in the analysis of the questionnaire data, students reported being affected financially by the increase in the price of the test. The students in this focus group, however, suffered more from the increase, since all of them had repeated the test a number of times (at least four) and at different test centers. The total cost of the test for these students ranged from SAR 4000 (approximately USD 1064) to SAR 6000 (approximately USD 1596). The reasons for repeating the test were almost identical among these students: the main reason was failing to achieve either the band score requirement, especially in writing, or the overall score. Notably, one of them mentioned that the institution she had applied to raised its required overall score after she had achieved the previously required overall score on her second attempt. She said:
My university raised the overall score from 6.5 into 7…you know that half band score is so difficult to reach in IELTS and I remember I got 6.5 but when I applied they said no you can’t …your offer has expired and you need to apply again because our conditions have changed...the IELTS overall score is 7.
That student could not achieve the extra half band even after repeating the test twice. Eventually, she decided to give up and opt for another university that accepted “reasonable scores…it accepts 6.5.” She indicated that she was forced into this decision for many reasons, chief among them the financial burden of the test, which, according to her and other students, prevented them from taking the risk again and led them to lower their expectations (i.e., to be content with what they had). This is consistent with the findings of Chappell et al. (2015), who found that some participants “felt there was an underlying financial purpose, or incentive, for the owners of IELTS, especially since the requirement for re-taking the test has been eased, which is seen by P3 as replacing genuine care for test-takers with financial gain” (p. 12). Moreover, her situation, in which she was refused entry into a university because of low proficiency, is consistent with what Shohamy (2001) argued with respect to cut-off scores: “Cutting scores are often used by those in authority as ‘gate-keeping’ tools—barriers to those who are not wanted” (p. 38).
This participant also found herself in a difficult position because her second test result was almost 2 years old, and her university (the one with the 6.5 option) would not accept expired results. Above all, she and other students explained that they depended on their families to pay for the cost of the test and were thus pushed to study harder each time. In fact, some of the students came from low-income families, yet their families kept paying, keen that their children be accepted into prestigious universities. Likewise, Moore et al. (2012) noted as follows:
Parents and guardians of IELTS test-takers would seem to be a significant stakeholder group since, without their financial support and encouragement, their children would likely not be able to sit the Test, or to achieve a band score that could enable access to overseas scholarship opportunities.
Other students reported similar experiences. Some said that they retook the test because of fluctuations in their scores; that is, their scores tended to change each time they sat the exam. For instance, if they achieved the overall score, they would underperform in one module and miss the required minimum band score there. They would therefore repeat the test in order to meet both requirements. This was the biggest challenge faced by all the students in this study: meeting both score requirements placed them in double jeopardy. In fact, they were more concerned with the band scores, since the overall score could easily be compensated for by performing well in the other modules. One student described this as follows:
Sometimes you have a low score on writing and you want to take the test for a second time but you know what…The problem is that you will fail in another thing…I don’t believe the test is testing my skill as before because the content is different the expressions that I used could be different the marker of the test is different…so…so how do you want to keep the same score because there are other things which come into play when you repeat the test…anyway I lost my money.
This student seemed to blame the reliability of the test for losing his money, since he claimed to be confident that his performance had been good. This is consistent with the findings of Mickan and Motteram (2008), who reported that “repeated test-taking and candidates’ experience of variations in scores” have “affected test-takers’ confidence in the Test” as “they expressed frustration with rating procedures” (p. 19). When the researcher asked this participant why he did not complain to the test center about this problem, he and other students said that it would cost extra money to have their test papers remarked: the fee for remarking is currently SAR 600 (approximately USD 160). They would rather repeat the test and pay SAR 1000 (approximately USD 266), as they said they had “little chances if no chances at all to get higher scores” than the ones that they got. The economic impact of the test on this student is broadly consistent with what Moore et al. (2012) found in their study: “for those test-takers who do not succeed in achieving the required IELTS scores, the reality they face is one of uncertainty and financial stress (e.g., if they decide to continue studying and re-sit the Test)” (p. 47).
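The remark-versus-retake trade-off these students describe can be made concrete with a small sketch. The two fees come from the text; the SAR/USD peg rate and the break-even framing are illustrative assumptions.

```python
# Comparing the two options a dissatisfied candidate faces (fees from the text).
SAR_PER_USD = 3.75     # assumed peg rate, not from the article
REMARK_FEE_SAR = 600   # fee to have the test papers remarked
RETAKE_FEE_SAR = 1000  # fee to sit the whole test again (early-2017 price)

print(f"Remark: SAR {REMARK_FEE_SAR} (~USD {REMARK_FEE_SAR / SAR_PER_USD:.0f})")
print(f"Retake: SAR {RETAKE_FEE_SAR} (~USD {RETAKE_FEE_SAR / SAR_PER_USD:.0f})")

# A remark is cheaper, but it is only worthwhile if the chance of an upgraded
# score justifies its fee; the interviewees judged that chance to be near zero,
# so they preferred to pay the larger retake fee for a fresh attempt.
print(f"Extra cost of retaking over remarking: SAR {RETAKE_FEE_SAR - REMARK_FEE_SAR}")
```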
Accordingly, almost all the students who were interviewed suggested that the IELTS exam be made modular, i.e., that students be allowed to repeat only the module in which they had failed. In fact, the IELTS test is not modular: candidates cannot take its modules separately (apart from the speaking module), because “Performance in the four skill areas is combined to provide a maximally reliable composite assessment of a candidate’s overall language proficiency at a given point in time” (Penn State University, 2017). To work around this, some students suggested that universities allow applicants to submit two test reports and judge proficiency on the basis of the better performance in each component. That is, if a student failed the writing module but achieved the required overall score in test A, and passed the writing module but failed the overall score in test B, the student should be accepted, because the minimum requirement for each component was met in at least one test. This is consistent with the findings of Merrifield (2008), who found that:
Several associations in all three countries have resolved this issue by developing a policy that the best result in each macroskill in any two or more IELTS Tests taken in the last two years would be accepted. This does not affect the cost to the candidate of taking successive tests, but gives greater flexibility in the outcomes.
Similarly, in an attempt to lessen the washback effect of tests, Barkaoui (2017) pointed out that “Test users want to know when a candidate submits multiple test scores whether to use only one of them (e.g., the most recent one) or they can use information from multiple tests” (p. 429).
Additionally, some students mentioned that the test imposed financial difficulties on them because they had to travel to another region or, sometimes, another country to reach the nearest available test date. They explained:
The academic module is not available every week and there is a deadline for registration so, if you miss the deadline which is usually 3 or 4 days before the actual test date you have to wait until the next test date or find another test center and you know we have to give to the university as soon as we can.
Currently, the test is offered in ten regions in Saudi Arabia: Riyadh, Jeddah, Khobar, Abha, Al-Hasa, Makkah, Madinah, Buraidah, Al Jubail, and Tabuk. The academic module is available every 2 weeks in Al-Hasa, whereas it is available weekly in Dammam, 150 km away. This explains why students had to travel to other regions to reach the nearest available test date.
The cost of IELTS preparation courses
All the students who were interviewed indicated that they had undertaken some sort of preparation for the test; the majority enrolled in preparation courses before sitting it, keen to pass on the first attempt and so save money. The motivation to attend these courses seems to have been twofold. First, students wanted to pass the test and had no intention of learning anything else, especially content in English for Academic Purposes (EAP) courses that was not related to the test. This is consistent with Green (2007), who found that students tended to achieve faster gains when enrolled in dedicated IELTS preparation courses. More importantly, he found that IELTS preparation courses differed in focus from EAP courses, as they “directed learners away from their academic subjects and toward the topics and text types featured in the test” (Green, 2006, p. 364). Second, students wanted to be ready for the test so as to avoid paying more money should they fail it. One student said that she enrolled in preparation courses for the following reason:
It is shortcut...if you need to succeed on the test you need to be prepared...it is good for you to have at least a general idea about the test. You will lose money if you say I want to risk it and take the first test to familiarize myself and then repeat the test again... and this time you are more serious...I think you will repeat the test many times in this way and if you put this money in the preparation course you will benefit from it.
Some students indicated that there was a shortage not only of test centers but also of places for test preparation: the nearest British Council or International Development Program (IDP) center was in Dammam, 150 km from Al-Hasa. The students had more faith in the materials and courses offered by these centers than in anything else on the market, which is why they preferred to go there despite the availability of some commercial centers in Al-Hasa that offered IELTS preparation courses. Yet the students did not find the online materials offered by the British Council adequate. One student said, “they don’t give you enough training in the test because they want you to take the preparation courses they offer…I know why…they want to make money but not everybody can pay this much.” This echoes the findings of Ingulsrud, who noted that “it goes without saying that the test preparation industry reflects economic inequalities in education: high-quality coaching is available only to those who can afford it” (1994, p. 72).
Most importantly, the students faced a major problem prior to registering for the test. In order to save money, and perhaps to save face, they had to book a test date while still preparing, so as to secure a place either in Al-Hasa, their home town, or in the nearest place that had a test center. This was due to the high demand for the test. However, students indicated that they were not sure when they would be ready for the test or whether the training they had received would be enough to help them achieve the required test score. This situation put pressure on the students, who were compelled to study as hard as they could within the given time.
Also, some students reported encountering fraudulent websites claiming to be the British Council when they wanted to register for the test. They were therefore careful to check the spelling and materials, since they did not want to risk their money and future. One student indicated that she found a social media account that sold IELTS test reports. She said that she had read posts on that account about people who had bought test results from abroad and saved a lot of effort and money as well. She said:
My whole future will be distorted if I do this...I am not that person who cheat and I think if I have had a false test report the university will accept me but my performance or level of proficiency is not going to help me to write a homework or a dissertation.
Fortunately, this student was aware of the consequences of using forged test results.
The research reported in this study has shown that the IELTS exam created negative economic washback on the participants, as they were financially affected by the test. Both the questionnaire and the interview results have shown that the test is expensive and that its cost keeps rising. Those who had to repeat the test because they failed to achieve the minimum requirements, either in the overall score or in the sub-scores (especially in the writing module), were the ones most affected financially by the price of the test. In fact, all students in the focus group repeated the test at least four times. This created a financial burden on their families, as most of the students depended on their parents to pay for the cost of the test, and, thus, they were pushed to study harder each time. Likewise, travel expenses affected the participants negatively, as some of them had to travel to another region or country because the academic module was not offered in their test center every week. By the same token, preparation courses proved to be an additional financial burden on these participants, especially because they were not sure when they would be ready for the test or whether the training they had received would be enough to help them achieve the required test score.
The results of the study, however, need to be interpreted with caution because the sample was small. In addition, the study was conducted in a single setting at one Saudi university. Therefore, the findings cannot be generalized to other contexts. Furthermore, this study was limited to one research approach, namely the qualitative approach; accordingly, the data collection tools were based on this approach and were semi-structured. For a more comprehensive and accurate picture of the economic washback on IELTS candidates, future washback studies need a larger, more representative sample of participants. Similarly, future research should consider other stakeholders so that multiple sources of evidence are included for the sake of generalization. In addition, future studies should adopt a mixed-methods approach (qualitative and quantitative) and include multiple data collection tools in order to triangulate the data for the purpose of validation.
The findings of this study have implications for IELTS stakeholders. In order to keep this financial washback to a minimum, especially for test repeaters, the participants of this study recommended that the test price be reconsidered, that the test be made modular, and that the best result in each module across any two or more IELTS tests taken in the last 2 years be accepted.
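The participants’ last recommendation amounts to a simple score-combination rule: for each module, take the highest band achieved across all attempts that fall within the two-year validity window. A minimal sketch of that rule follows; the module names, band scores, and dates are hypothetical illustrations, not data from this study.

```python
# Sketch of the "best result per module" recommendation: combine the
# highest band score in each module across attempts within a two-year window.
from datetime import date

def best_combined(attempts, as_of, window_years=2):
    """attempts: list of (test_date, {module: band}) tuples.
    Returns the best band per module among attempts taken within
    `window_years` of `as_of`."""
    cutoff = date(as_of.year - window_years, as_of.month, as_of.day)
    best = {}
    for test_date, scores in attempts:
        if test_date < cutoff:
            continue  # attempt is outside the two-year validity window
        for module, band in scores.items():
            best[module] = max(best.get(module, 0.0), band)
    return best

# Two hypothetical attempts: neither meets a 6.0 minimum in every module
# on its own, but the combined best result does.
attempts = [
    (date(2017, 3, 1), {"listening": 6.5, "reading": 6.0, "writing": 5.5, "speaking": 6.0}),
    (date(2017, 9, 1), {"listening": 6.0, "reading": 6.5, "writing": 6.0, "speaking": 6.5}),
]
print(best_combined(attempts, as_of=date(2018, 1, 1)))
```

Under this rule, a repeater who improves one module at a time would not need to re-achieve every sub-score in a single sitting, which is precisely the cost saving the participants had in mind.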
Abbreviations
EAP: English for Academic Purposes
IDP: International Development Program
IELTS: International English Language Testing System
P3: Participant number 3
SAR: Saudi Arabia Riyal
TOEFL: Test of English as a Foreign Language
USD: United States Dollar
Alderson, JC, & Wall, D. (1993). Does washback exist? Applied Linguistics, 14(2), 115–129.
Allen, D. (2017). Investigating Japanese undergraduates’ English language proficiency with IELTS: predicting factors and washback. IELTS Partnership Research Papers, 2. IELTS Partners: British Council, Cambridge English Language Assessment and IDP: IELTS Australia.
Bachman, LF (1990). Fundamental considerations in language testing. Oxford: Oxford University Press.
Barkaoui, K. (2017). Examining repeaters’ performance on second language proficiency tests: a review and a call for research. Language Assessment Quarterly, 14(4), 420–431.
British Council. (2017a). Test dates, fees and locations. Retrieved November 22, 2017 from https://www.britishcouncil.sa/exam/ielts/dates-fees-locations.
British Council. (2017b). Frequently asked questions. Retrieved November 27, 2017 from https://ielts.britishcouncil.org/FAQ.aspx#question’26’.
Broadfoot, P (1996). Education, assessment and society. Buckingham: Open University Press.
Brown, JDH (1998). An investigation into approaches to IELTS preparation, with particular focus on the academic writing component of the test. In S Wood (Ed.), IELTS research reports, (pp. 20–37). Sydney: ELICOS/IELTS.
Chappell, P., Bodis, A., & Jackson, H. (2015). The impact of teacher cognition and classroom practices on IELTS test preparation courses in the Australian ELICOS sector. IELTS Research Report Series, 6, 1–61. Retrieved from https://www.ielts.org/-/media/research-reports/ielts_online_rr_2015-6.ashx.
Cheng, L, Watanabe, Y, Curtis, A (2004). Washback in language testing. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Craven, E (2012). The quest for IELTS band 7.0: investigating English language proficiency development of international students at an Australian university. In J Osborne (Ed.), IELTS research report series (Vol. 13). Melbourne: IDP: IELTS Australia and British Council.
Davies, A, Brown, A, Edler, C, Hill, K, Lumley, T, McNamara, T (1999). Dictionary of language testing. In Studies in language testing 7. Cambridge: Cambridge University Press.
Elder, C, & O’Loughlin, K (2003). Investigating the relationship between intensive English language study and band score gain on IELTS. In R Tulloh (Ed.), IELTS research reports, (vol. 4, pp. 207–254). Canberra: IELTS Australia Pty Limited.
Gipps, C (1994). Beyond testing: towards a theory of educational assessment. London: Falmer Press.
Green, A. (2006). Watching for washback: observing the influence of the international English language testing system academic writing test in the classroom. Language Assessment Quarterly, 3(4), 333–368.
Green, A (2007). IELTS washback in context: preparation for academic writing in higher education. Cambridge: Cambridge University Press.
Hughes, A (2003). Testing for language teachers, (2nd ed., ). Cambridge: Cambridge University Press.
Humphreys, P, Haugh, M, Fenton-Smith, M, Lobo, A, Michael, R, Walkinshaw, I (2012). Tracking international students’ English proficiency over the first semester of undergraduate study. IELTS research reports online series (Vol. 1). Melbourne: IDP: IELTS Australia and British Council.
IELTS. (2017b). Why accept IELTS scores? Retrieved 10 November, 2017 from https://www.ielts.org/ielts-for-organisations/why-accept-ielts-scores.
Ingulsrud, JE (1994). An entrance test to Japanese universities: social and historical contexts. In C Hill, K Parry (Eds.), From testing to assessment: English as an international language, (pp. 61–81). London: Longman.
Lincoln, YS, & Guba, EG (1985). Naturalistic inquiry. Beverly Hills, CA: Sage Publications.
Macqueen, S, Pill, J, Knoch, U. (2016). Language test as boundary object: perspectives from test users in the healthcare domain. Language Testing, 33(2), 271–288.
Merrifield, G (2008). An impact study into the use of IELTS as an entry criterion for professional associations—Australia, New Zealand and the USA. In J Osborne (Ed.), IELTS research reports (Vol. 8). Canberra: IELTS Australia.
Merrifield, G (2011). An impact study into the use of IELTS by professional associations and registration entities: Canada, the UK and Ireland. In IELTS research reports (Vol. 11). Melbourne: IDP: IELTS Australia and British Council.
Merrifield, G. (2016). An impact study into the use of IELTS by professional associations in the United Kingdom, Canada, Australia and New Zealand, 2014 to 2015. IELTS research reports online. Retrieved 5 October, 2017 from https://www.ielts.org/-/media/research-reports/ielts_online_rr_2016-7.ashx.
Merrylees, B (2003). An impact study of two IELTS user-groups: candidates who sit the test for immigration purposes and candidates who sit the test for secondary education purposes. In R Tulloh (Ed.), IELTS research reports (Vol. 4). Canberra: IDP: IELTS Australia.
Mickan, P., & Motteram, J. (2008). An ethnographic study of classroom instruction in an IELTS preparation program. In J. Osborne (Ed.), IELTS research reports (Vol. 8). Retrieved 14 October, 2017 from https://www.ielts.org/-/media/research-reports/ielts_rr_volume08_report1.ashx.
Mickan, P, & Motteram, J (2009). The preparation practices of IELTS candidates: case studies. In J Osborne (Ed.), IELTS research reports (Vol. 10), (pp. 223–262). Canberra: IDP: IELTS Australia and British Council.
Moore, S, Stroupe, R, Mahony, P (2012). Perceptions of IELTS in Cambodia: a case study of test impact in a small developing country. In J Osborne (Ed.), IELTS research reports (Vol. 13). Melbourne: IDP: IELTS Australia and British Council.
O’Loughlin, K (2000). The impact of gender in the IELTS oral interview. In R Tulloch (Ed.), IELTS research reports (Vol 3), (pp. 1–28). Melbourne: IELTS Australia.
O’Loughlin, K, & Arkoudis, S (2009). Investigating IELTS exit score gains in higher education. In J Osborne (Ed.), IELTS research reports (Vol. 10), (pp. 95–180). London: Canberra and British Council.
O'Sullivan, B. (2011). Language testing: looking back and looking forward. A seminar Retrieved 25 September, 2017 from http://englishagenda.britishcouncil.org/continuing-professional-development/teacher-educator-framework/knowing-subject/language-testing-looking-back-and-looking-forward.
O'Sullivan, B, & Lu, Y (2006). The impact on candidate language of examiner deviation from a set interlocutor frame in the IELTS speaking test. In IELTS research reports (Vol. 6), (pp. 91–118). London: IELTS Australia, Canberra and British Council.
Penn State University (2017) IELTS—frequently asked questions from researchers. Retrieved 20 September, 2017 from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.177.2074&rep=rep1&type=pdf.
Rao, C, McPherson, K, Chand, R, Khan, V (2003). Assessing the impact of IELTS preparation programs on candidates’ performance on the general training reading and writing test modules. In IELTS research reports (Vol. 5), (pp. 237–262). Canberra: IDP:IELTS Australia.
Read, J., & Hayes, B. (2003). The impact of IELTS on preparation for academic study in New Zealand. In IELTS research reports (Vol. 4). Retrieved 13 September, 2017 from https://www.ielts.org/-/media/research-reports/ielts_rr_volume04_report5.ashx.
Rea-Dickins, PR, Kiely, R, Yu, G (2007). Student identity, learning and progression: the affective and academic impact of IELTS on ‘successful’ candidates. In IELTS research reports (Vol. 7). Manchester: British Council and Canberra: IDP Education Australia.
Saville, N, & Hawkey, R (2004). The IELTS impact study: investigating washback on teaching materials. In L Cheng, Y Watanabe, A Curtis (Eds.), Washback in language testing: research contexts and methods, (pp. 73–96). Mahwah, NJ: Lawrence Erlbaum Associates.
Shohamy, E (2001). The power of tests: a critical perspective of the uses of language tests. Harlow: Longman.
Spolsky, B (1995). Measured words. Oxford: Oxford University Press.
Weir, C, O’Sullivan, B, Yan, J, Bax, S (2007). Does the computer make a difference? The reaction of candidates to a computer-based versus a traditional hand-written form of the IELTS writing component: effects and impact. In P McGovern, S Walsh (Eds.), IELTS research reports (Vol. 7), (pp. 311–347). Canberra: British Council & IELTS Australia.
The author is grateful to the editor-in-chief and the anonymous reviewers for their constructive feedback on earlier versions of this article. Special thanks are due to the participants of this study.
The author read and approved the final manuscript.
Availability of data and materials
Data will not be shared because they will be used in other publications.
The author declares that he has no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.