An exploration of students’ voices on the English graduation benchmark policy across Northern, Central and Southern Vietnam


Despite the wide implementation of the English graduation benchmark policy in Vietnam, students’ voices on this one-size-fits-all policy remain under-researched and under-represented. The purpose of this research is therefore twofold. The first is to examine whether any differences actually exist among Northern, Central and Southern Vietnamese students’ voices on the policy, because these three regions are heterogeneous in characteristics ranging from cultures to English learning conditions to socio-economic backgrounds. The second is to investigate how students at elementary, intermediate and upper intermediate levels perceive the policy. Drawing on a sample of 902 students in these three regions, the study adopted a quantitative approach in the form of a questionnaire survey. One-way analysis of variance (ANOVA) and Hochberg’s GT2 post hoc tests yielded significant differences in students’ voices on the ‘Benefits’ and ‘Anxiety’ factors. When English proficiency levels within each region were considered, a multivariate analysis of variance (MANOVA) with the Bonferroni method also showed variations in students’ voices on the ‘Benefits’, ‘Anxiety’ and ‘Test-oriented learning’ factors. The findings are intended to convey students’ dissenting voices to stakeholders, including the Ministry of Education and Training (MOET), policy makers, higher education institutions (HEIs) and teachers, so that they can adjust the policy and devise innovative pedagogical strategies.


Coupled with the burgeoning demand for English proficiency, the MOET ushered in a wave of top-down and bottom-up reforms, particularly the National Foreign Languages Project (NFLP) 2020 with a budget of VND 9.4 trillion ($443 million), which aimed for English to become an advantage of Vietnamese graduates by 2020 (Bui & Nguyen, 2016). Since Vietnamese tertiary students’ working knowledge of spoken and written English has been receiving mounting criticism from employers and society (Nguyen, 2013; Tran & Marginson, 2018), the NFLP 2020 has contributed to raising their awareness of the need to improve their English language competence. As the third prominent task in the NFLP 2020, there have been considerable efforts to substantially reform the practices of language testing and assessment (Nguyen, Nguyen, Van Nguyen, & Nguyen, 2018). This includes the development of a national language proficiency framework compatible with the Common European Framework of Reference for Languages (CEFR), called the Vietnam Foreign Language Framework (VFLF) (Le, 2017; Nguyen et al., 2018; Phuong, 2017; Van Huy & Hamid, 2015). Accordingly, Vietnamese non-English major students are required to reach level 3 of the VFLF, equivalent to CEFR level B1, in order to be awarded a bachelor’s degree (Phuong, 2017). Under this policy, students can choose from an array of English language proficiency tests (ELPTs) developed by both domestic and international testing institutes, such as the Vietnamese Standardised Test of English Proficiency, the International English Language Testing System (IELTS), the Test of English for International Communication (TOEIC), the Test of English as a Foreign Language (TOEFL), the Cambridge Preliminary English Test (PET) and the First Certificate in English (FCE) (Le, 2017; Nguyen, 2013). Vietnamese non-English major students, however, still prefer IELTS, TOEIC, TOEFL, PET or FCE because of their recognised international credibility (Le, 2017).

In stark contrast to the rosy picture painted by the wide implementation of ELPTs as a graduation benchmark policy, only a relatively small number of students have been able to fulfil the requirements, as indicated by their scores on ELPTs administered either nationally or internationally (Le, 2017). Without closely examining the deep-rooted problems associated with the teaching and learning of English in Vietnam, the efforts of adopting the CEFR, producing the VFLF and implementing the English graduation benchmark policy are argued to be over-ambitious and unachievable for students (Hoang, 2010; Nguyen et al., 2018; Van Huy & Hamid, 2015).

Outdated classroom teaching and testing, with their heavy focus on grammatical accuracy and written structures, are the first underlying reason (Le Ha, 2009; Phuong & Nhu, 2015). Students place the main emphasis on memorising grammatical rules and improving grammatical accuracy (Phuong, 2017), while teaching approaches become more test-oriented because teachers are under pressure to raise students’ success rates (Lan, 2017). This has provided few opportunities for students to practise and improve their listening and speaking skills; consequently, they perform poorly on ELPTs testing the four macro skills (i.e. speaking, listening, writing and reading) and find it difficult to meet the graduation threshold of English proficiency (Phuong, 2017). They also have little confidence in applying their knowledge of language systems (e.g. vocabulary, grammar structures) for communicative purposes and lack motivation to learn English (Phuong & Nhu, 2015).

In addition, Vietnamese students’ English proficiency levels remain limited (Nguyen, 2018; Ton & Pham, 2010; Trang & Baldauf Jr, 2007). As reaching a certain level of English proficiency correlates strongly with both students’ positive attitudes towards the policy and their sustained efforts to satisfy it (Chen & Liu, 2007; Pan, 2015; Shih, 2008; Tsai & Tsou, 2009; Wu & Lee, 2017), the limited English language skills of Vietnamese non-English major students fall far short of the testing and assessment requirements.

Another view, supported by Shohamy, Donitsa-Schmidt, and Ferman (1996), Shih (2007), and Wu and Lee (2017), among others, emphasises that a fixed standard in implementing ELPTs may not be suitable for every student. As stated by Hoang (2010), the adoption of ELPTs is suitable only for those planning to pursue further education in English-speaking countries. The problem with this one-size-fits-all approach is that not all test-takers hold desirable attitudes towards the policy or have sufficient knowledge and learning resources to satisfy test demands (Bailey, 1996; Xiang-Dong, 2007). These authors argue that the policy does not take all aspects of testing and characteristics of test-takers into rigorous consideration. In Vietnam, this is true because students in the three regions of Northern, Central and Southern Vietnam (NOR, CEN and SOU, respectively) are heterogeneous in characteristics ranging from cultures to English learning conditions to socio-economic backgrounds. The disparities between rural areas and more economically developed urban areas have left wide gaps in educational opportunities and academic accomplishment (Bui & Nguyen, 2016). For instance, in more economically developed areas (e.g. Ho Chi Minh City, Ha Noi, Da Nang, Vung Tau, Hai Phong), mastery of the English language is a top priority, whereas students in economically disadvantaged areas face serious challenges in meeting the requirements of English proficiency. Learners in rural and low socio-economic areas have limited opportunities to learn English owing to insufficient materials or low-quality teaching. Moreover, most minority students from remote and mountainous areas experience insurmountable difficulties at different stages of learning English, including the primary, secondary and tertiary levels of education (Nguyen & Bui, 2016).
Even though the MOET has introduced policies directly related to language and education for ethnic minority students, most of these students encounter language barriers because they need both to learn Vietnamese as a second language and to learn English as a compulsory subject (Nguyen et al., 2018). For instance, an ethnic minority student in Nhan’s (2013) study voices that ‘If a TOEIC certificate is required, it is as if the university gave me a freeway in but no way out’ (p. 38). Echoing previous studies (McNamara, 2012; McNamara & Ryan, 2011) arguing that ELPTs ignore the views of test-takers and fail to reflect real-life situations, the heterogeneity among NOR, CEN and SOU might arouse doubts over the effective implementation of the English benchmark policy in Vietnam.

Despite the ever-increasing use of ELPTs as a graduation condition in Vietnam, a research gap emerging from the literature is that students’ voices about the benchmark policy have seldom been heard. There has hitherto been only one piece of empirical evidence seeking Vietnamese students’ opinions on the policy (i.e. Nhan, 2013); however, this endeavour has methodological limitations in the conclusions it draws. Questions of trustworthiness arise because the author fails to analyse and report empirical findings related to the employed questionnaire. To examine how Vietnamese administrators, teachers and students at universities and colleges located in a city perceived the use of TOEIC as a graduation requirement, three sets of questionnaires including closed-ended and open-ended questions were distributed for data collection. Although Nhan (2013) emphasises that participants’ responses would be selected to underline her arguments rather than being analysed quantitatively, the findings presented are based solely on personal communication, which is not stated as a data collection instrument. In other words, there is a scarcity of evidence on how the data were collected and analysed (Anney, 2014).

With these limitations in mind, the present research was conceived to explore Vietnamese students’ voices on the English graduation benchmark policy by drawing on a sample of 902 students in NOR, CEN and SOU and examining whether any differences actually exist among students in these regions. This attempt, according to Su (2005), is critically needed, as the graduation requirement should be based on an analysis of students’ needs. Besides, several studies have recently expounded conflicting views about how students’ English proficiency level influences their perceptions of the policy (Chen & Liu, 2007; Hsu, 2009; Shih, 2008; Tsai & Tsou, 2009; Wu & Lee, 2017). Wu and Lee (2017) have called for further empirical research sampling elementary students (equivalent to CEFR level A2) in order to construct a clear picture of how English proficiency influences students’ attitudes towards the English graduation policy. Thus, this study contributes to the existing literature by examining how Vietnamese students in NOR, CEN and SOU whose English proficiency levels are elementary (ELE), intermediate (INT) and upper intermediate (UPP) perceive this graduation policy. Since these students had achieved either IELTS or TOEIC results by the time they took part in this research, their success or failure in satisfying the requirements of their HEIs makes a significant contribution to understanding how Vietnamese students with different English proficiency levels view the graduation threshold of English proficiency.

Noting that Vietnamese students’ voices on the English graduation benchmark policy have been under-researched and under-represented, the findings are intended to convey students’ dissenting voices to stakeholders, including policy makers, teachers and even test developers in Vietnam, so that they can adjust the policy and devise innovative pedagogical strategies. Such actions would not only foster Vietnamese students’ English proficiency but also support the successful implementation of the policy. The results could also serve as a source of reference for other countries planning to adopt a similar English graduation benchmark policy.

Literature review

Reasons behind Vietnamese tertiary students’ limited English proficiency

In order to be eligible for university admission, Vietnamese students must sit the national high school graduation examination, taking three compulsory exams (i.e. Mathematics, Vietnamese Literature and English) and two optional composite tests (i.e. natural sciences, comprising Biology, Chemistry and Physics, and social sciences, comprising History, Geography and Civic Education). Even though English is one of the three compulsory subjects alongside Mathematics and Vietnamese Literature, high school students’ performance in English is low. According to the annual report released by the MOET in 2018, 814,779 students took the English test consisting of 80 multiple-choice items on English phonetics, vocabulary, grammar, writing and reading comprehension. Nevertheless, the statistics showed that 637,335 students (78.22%) scored below 5 on a scale of 10. This can be explained by students devoting their efforts entirely to the specific subjects in the fixed group required by their major choices (e.g. Group A: Mathematics, Physics, Chemistry; Group B: Mathematics, Biology, Chemistry; Group C: Literature, History, Geography; Group A1: Mathematics, Physics, English; Group D1: Literature, English, Mathematics). Sharing certain similarities with students in other exam-oriented cultures (e.g. China, Japan, Taiwan or South Korea), Vietnamese high school students are strongly fixated on success in this competitive admission exam. As part of the Confucian legacy, they have a long tradition of treasuring academic success because it affects not only themselves but also their families (Li & Li, 2010). This explains why, rather than improving their English skills, they adopt entirely exam-driven learning approaches to mastering the knowledge required by their major choices.
In addition, to prepare for this high-stakes English exam, Vietnamese high school students are sent to cram schools to memorise a large number of grammar rules instead of practising communication skills. Consequently, Vietnamese non-English major students lack adequate preparation for their English learning and are particularly challenged by communicative tasks when they proceed to higher education (Van Canh & Barnard, 2009).

Added to this, curriculums at the primary and secondary levels of English language education in Vietnam are argued to be inconsistent across grades, resulting in repetitiveness and inefficiency, as students learn similar content in different grades (Phuong & Nhu, 2015). For instance, students in both grade 6 (12-year-olds) and grade 10 (16-year-olds) are instructed to introduce themselves using the present simple tense. This is considered a demotivating factor, which might exert widespread negative effects on learning outcomes (Sakai & Kikuchi, 2009).

Another reason given for Vietnamese students’ limited English competence is the quality of teachers, whose own English proficiency is far from satisfactory (Dudzik & Nguyen, 2015; Bui & Nguyen, 2016; Van Canh & Renandya, 2017). The NFLP 2020 requires primary and lower secondary teachers to reach CEFR level B2 while upper secondary teachers are expected to reach CEFR level C1; however, 83% of primary teachers, 87.1% of lower secondary teachers and 91.8% of upper secondary teachers could not meet the required benchmark level (Nguyen, 2013).

With such inadequate English language skills, Vietnamese non-English major students lack preparation for their own learning when they proceed to higher education (Van Canh & Barnard, 2009). At the tertiary level, each individual HEI has the authority to decide what to teach based on the general guidelines from the MOET, promoting diversity in teaching content on the one hand but leading to inconsistency on the other. The most popular English language programme offered by HEIs is general English, which aims to foster students’ four macro skills; however, these courses consist of only 90–210 class periods in which students are exposed to basic grammar, reading and basic English communication (Bui & Nguyen, 2016). Materials imported from English-speaking countries might, in some cases, not be appropriate for Vietnamese students’ social, cultural and educational backgrounds (Phuong, 2017). Other HEIs implement English for Specific Purposes courses, but materials focusing on reading comprehension and vocabulary exercises provide little chance for students to practise their language skills (Hoa & Mai, 2016). For example, universities and colleges located in the Mekong Delta of Vietnam are free to develop teaching curriculums and choose materials, but the lack of standardisation erects barriers for students from these HEIs to achieving the outcomes anticipated by the MOET (Phuong, 2017). Students with different levels of English proficiency are placed in the same classrooms. They have only approximately 150–180 instructional hours (45–50 min/h) to practise and improve their English competence in the first 2 years (four semesters) at their HEIs, whereas 180–200 and 350–400 guided learning hours are suggested for reaching CEFR levels A2 and B1, respectively (Desveaux, 2013).
In a similar vein, Nguyen, Fehring, and Warren (2015) postulate that students from rural and remote areas feel demotivated when learning English in the same class as more proficient counterparts. In their study, students worked to improve their English proficiency using the American Headway series over five semesters; however, the amount of time allocated was insufficient for the development of the four macro skills. The guided learning hours in each semester were only enough for teachers to instruct students in grammar, vocabulary and reading. Without sufficient opportunities for English exposure, students not only show less willingness to increase their English competence and sustain their learning interests, but also receive limited reinforcement of knowledge (Baurain, 2010; Nguyen et al., 2015; Su, 2006). Consequently, their learning interests are not stimulated even though they realise the paramount importance of English fluency and mastery for their future careers (Ton & Pham, 2010).

The impacts of high-stakes English proficiency testing on learning

Upon graduation, Vietnamese non-English major students are not able to use English effectively in communication or job interviews despite learning English for seven years at school and three or four years at the tertiary level (Ngoc & Iwashita, 2012; Tran, 2013). The implementation of high-stakes English proficiency tests is thus expected to give Vietnamese students incentives to learn English seriously, because changes in students’ learning run parallel to changes in assessment methods (Brown, Bull, & Pendlebury, 2013). Serving gate-keeping purposes (Chen & Tsai, 2012), ELPTs have become high stakes and are perceived as ‘effective tools for controlling educational systems and prescribing the behaviours of those who are affected by their results’ (Shohamy et al., 1996, p. 229). The influences of testing on students’ learning are referred to as washback effects (Bailey, 1996; Cheng & Curtis, 2012; Cheng & Watanabe, 2004; Wall, 2013). Washback is positive if tests are well designed and appropriately used, and negative if tests exert negative impacts on students’ learning (Bailey, 1996; Shohamy, 2014; Tsagari & Cheng, 2016; Xie & Andrews, 2013; Zhan & Andrews, 2014).

In terms of advantages, Pan (2015) postulates that the policy of using high-stakes ELPTs generates positive motivation and fosters a sense of accomplishment among students. Under this motivational policy (Deci & Ryan, 2016), test-takers are extrinsically motivated to devote more considerable effort to altering their learning behaviours and improving their test performance (Cavendish, Márquez, Roberts, Suarez, & Lima, 2017). In the current situation of English learning in Vietnam, high-stakes ELPTs may positively affect students’ learning progress because students come to realise their own ability levels and become aware of the knowledge or skills they need to improve (Stecher, 2010). For those with limited English proficiency or who felt demotivated in learning English at high school, these high-stakes ELPTs facilitate and assess learning outcomes by diagnosing students’ strengths and weaknesses (Popham, 2014).

Research has shown that the adoption of high-stakes ELPTs as a graduation requirement brings considerable benefits to students’ interest in learning English. Su (2005), in a survey of 539 Taiwanese students at a technological university, finds that 50% of students agree with the English certification requirement because it promises substantial benefits for their future job prospects and education. In line with the positive findings of Su’s (2005) study, Pan (2009) further reports that 63% of students support the English graduation benchmark policy, evincing more interest in learning English and expending sustained effort on improving their language skills. Such findings are echoed by Pan and Newfields (2012), who find that students at Taiwanese HEIs with the policy are more highly motivated to learn English than those at HEIs without it. These students also employ different language learning strategies and devote considerable time to learning English under the policy.

In Vietnam, Nhan’s (2013) study shows that the English graduation benchmark policy signals the commitment of universities and colleges to producing a future workforce with the desired outcomes. It helps key stakeholders, including HEIs, teachers and administrators, to re-evaluate their curriculums and make necessary adjustments. Congruent with other empirical evidence from China and Taiwan (Tsai & Tsou, 2009; Shih, 2012, 2013; Chu & Yeh, 2017; Wu & Lee, 2017), Nhan’s (2013) study reveals that the implementation of high-stakes English proficiency tests fosters both students’ positive perceptions and their intrinsic motivation to improve their English competence and secure brighter prospects for their future careers and further education.

Although it is plausible to reason that high-stakes tests give students ‘a way to construct meaning regarding learning’ (Roderick & Engel, 2001, p. 219), a large volume of literature has warned that such tests decrease intrinsic motivation, narrow learning to test-taking strategies and provoke negative feelings such as anxiety, stress and worry (Chen, 2012; Damer & Melendres, 2011; Popham, 2011; Ryan & Weinstein, 2009; Woodrow, 2011). This is because students’ cognitive performance suffers when they are genuinely worried about failing a test, have little self-confidence, or are preparing for the assessment (In’nami, 2006); when asked to draw a self-portrait in testing situations, students portrayed themselves as anxious and pessimistic test-takers. Another problem that can make high-stakes tests more damaging is that they undermine students’ intrinsic motivation (Ryan & Weinstein, 2009), making them more anxious and less willing to invest effort in learning (Deci & Ryan, 2016; Roderick & Engel, 2001). This can ‘make learners get discouraged, lose faith in their abilities, escape from participating in classroom activities, and even give up the effort to learn a language well’ (Na, 2007). Consequently, students place heavy emphasis on merely achieving acceptable test results without developing sufficient English language skills (Brown & Abeywickrama, 2010; Chapelle, Enright, & Jamieson, 2011). In the same vein, high-stakes tests produce superficial changes that do not awaken students’ natural learning curiosity.

Under the graduation policy, the accumulated empirical data also suggest contradictory results regarding the effectiveness of ELPTs for students’ learning. Tsai and Tsou (2009) found mismatches when the policy did not sufficiently reflect what is learnt and taught in the language classroom, leaving students with feelings of unfairness. Indeed, the inherent problems of one-size-fits-all assessment lead to a test-driven orientation in teaching and learning English (Gipps, 2002). Accordingly, there has been increasing dissatisfaction among students under the pressure of test preparation and cramming (Wang & Liao, 2012). Similarly, Chen and Squires (2010) state that the policy increases students’ test anxiety. They suggest that the English graduation benchmark policy should be re-examined and applied differently to specific career paths and academic disciplines. For example, students in business-related majors, who perceive English as indispensable in their future workplace, would be less anxious about the policy than those majoring in engineering or technology. Similarly, Tsai and Tsou (2009) argue that the English graduation requirement should not be employed as the sole assessment method and a prerequisite for degree conferral. In Vietnam, there has been a marked decline in graduation rates since the policy was established; for example, around 1500 students at a university in Da Nang were ineligible to be awarded their bachelor’s degrees (Nhan, 2013). These students also voice growing concerns about mismatches between the test contents and the curriculums at their institution.

The level of English proficiency and students’ voices on the English graduation policy

Several studies contextualised in Taiwan have recently expounded conflicting views about how students’ English proficiency level influences their perceptions of the policy (Chen & Liu, 2007; Hsu, 2009; Shih, 2008; Tsai & Tsou, 2009; Wu, 2012; Wu & Lee, 2017). English proficiency is claimed to correlate strongly with the attitudes that students hold towards the policy (Chen & Liu, 2007; Shih, 2008). Both English proficiency and learning attitudes play focal roles in how students change in response to the ELPTs (Pan & Newfields, 2012). Higher-proficiency students are more positively motivated by the policy because it enables them to achieve success in language learning (Dörnyei, 2009). They believe that ELPTs evaluate their level of English proficiency, so they display increasing willingness to study for and take ELPTs (Gan, Humphreys, & Hamp-Lyons, 2004). By contrast, students whose English level is lower are likely to lose motivation or have little confidence owing to unrelenting pressure and mounting anxiety (Chu, 2009). The English graduation benchmark policy might become an unattainable goal or even a burden for these students, as it impedes their learning interests (Pan & Newfields, 2012). Consequently, they assume that their learning serves primarily test-driven purposes (Hsu, 2009; Tsai & Tsou, 2009).

In a similar vein, it has been indicated that the policy is often compromised by students with a lower English proficiency level, who express negative language learning motivation and report high levels of test-induced fear and anxiety (Liu, 2012). As a result, the policy can become an unattainable goal for low-achieving students. On the other hand, Liauh’s (2011) study shows that students with lower English proficiency are more positively influenced than those with a higher English level when both groups evaluate the effectiveness of the graduation benchmark policy. A recent piece of research by Wu and Lee (2017) indicates that students, whether their English proficiency is classified as intermediate or high intermediate, all take positive attitudes towards the use of ELPTs as a university graduation requirement. Interestingly, their findings show that students’ attitudes towards the graduation policy in the intermediate group are much more positive than those in the high intermediate group. They further pinpoint the limitation of recruiting only high intermediate students and call for more investigation of elementary students, because high intermediate students tend to demonstrate strong learning motivation and autonomous learning behaviours, and their attitudes towards the English graduation benchmark policy are, apparently, positive and optimistic.


Research design and participants

The study adopted a quantitative approach in the form of a questionnaire survey conducted with Vietnamese students from different universities located in NOR, CEN and SOU. The online questionnaire was created in Google Forms and the link was posted on Facebook to call for voluntary participation. In this study, the web-based survey design was appropriate as it allowed for the rapid deployment of surveys to respondents who were geographically dispersed (Gall, Gall, & Borg, 2007). The survey link was also promoted to students by contacting teachers and lecturers at different universities across Vietnam and asking them to spread the information. This provided access to groups of students who would normally be difficult to reach through other channels, particularly those living in or having backgrounds from mountainous or remote areas (Table 1).

Table 1 Participant demographic information

In all, 902 students from NOR, CEN and SOU completed the survey. Of the students sampled, 34.04% (n = 307) were from NOR, 31.04% (n = 280) were from CEN and 34.92% (n = 315) were from SOU. All participants had achieved either IELTS or TOEIC results and indicated that their results reached the ELE/CEFR A2 (IELTS 3.0–4.5, TOEIC 225–545), INT/CEFR B1 (IELTS 4.0–5.0, TOEIC 550–780) or UPP/CEFR B2 (IELTS 5.5–6.5, TOEIC 785–940) level. A total of 48.89% of the students reached ELE, another 25.61% reached INT and 25.50% reached UPP. Across the three regions, the majority belonged to ELE, specifically 41.69% in NOR, 61.07% in CEN and 45.07% in SOU. INT was underrepresented in NOR (19.54%), whereas UPP was underrepresented in both CEN (19.29%) and SOU (18.10%).

Questionnaire design and pilot study

After reviewing relevant literature and prior empirical evidence (e.g. Tsai & Tsou, 2009; Pan & Newfields, 2011, 2012; Shih, 2012, 2013; Hsieh, 2017; Wu & Lee, 2017) related to students’ voices on the English graduation benchmark policy, a questionnaire survey was developed with two sections. The first section contains respondents’ demographics (i.e. gender, year of study, location of HEI, English proficiency level and IELTS/TOEIC results). In the second section, respondents received 21 English graduation policy-related questions based on a five-point Likert scale ranging from strongly disagree (1 point) to strongly agree (5 points). The survey questionnaire was written in English, translated into Vietnamese and then checked by two Vietnamese English teachers with experience in translation and interpretation. This ensured that participants would not encounter any difficulties in reading and understanding the questionnaire and choosing their answers. Prior to the main survey, a pilot study was undertaken to confirm the reliability of the scale and the validity of the questionnaire content. The questionnaire was sent to a group of 50 students studying at universities located in NOR, CEN and SOU. The reliability (alpha) coefficient for these questions was 0.905, which was deemed sufficiently reliable for investigating students’ voices on the English graduation benchmark policy.
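For readers unfamiliar with the reliability coefficient reported here, Cronbach’s alpha can be computed directly from a respondents-by-items matrix of Likert scores. The sketch below is a minimal illustration using hypothetical pilot responses (the study’s actual pilot data are not reproduced here):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 5 respondents x 3 Likert items (1-5)
pilot = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
])
print(round(cronbach_alpha(pilot), 3))  # → 0.918
```

In the study itself, the same formula applied to the 50 pilot respondents’ answers on the 21 items yielded the reported alpha of 0.905.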

Results and discussions

Dimensions of students’ voices on the English graduation benchmark policy

Exploratory factor analysis (EFA) using principal components extraction was employed to explore the underlying dimensions of the 21 variables driving Vietnamese students’ voices on the English graduation benchmark policy. As a statistical method, EFA aims to improve the reliability of a scale by both removing inappropriate items and determining the dimensionality of constructs (ten Holt, van Duijn, & Boomsma, 2010). The complete dataset obtained a Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy of 0.860 and a 0.000 significance level in Bartlett’s test of sphericity, meaning that the factorability was properly established and EFA could be applied to the dataset (Kaiser, 1974). As the KMO value exceeded the minimum threshold of 0.60, the factor analysis was deemed a useful validation of the scale (Churchill Jr, 1979). The criterion for meaningful factor loadings was set to 0.50, because factor loadings should exceed 0.50 for practical significance (Hair, Anderson, Tatham, & William, 1998). In this study, factor loadings ranged from 0.559 to 0.930 across the three domains, indicating that each variable contributed to establishing the factor structure (Hair, Ringle, & Sarstedt, 2011). Applying these criteria resulted in three principal components, labelled factor 1 ‘Benefits’, factor 2 ‘Anxiety’ and factor 3 ‘Test-oriented learning’.
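The factorability check can be made concrete with a short computation. Bartlett’s test of sphericity asks whether the items’ correlation matrix differs significantly from an identity matrix (in which case the items are intercorrelated enough for EFA). The sketch below uses the standard chi-square approximation of the test statistic with simulated, not the study’s, data:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data: np.ndarray):
    """Bartlett's test of sphericity: H0 states the correlation matrix is an
    identity matrix, i.e. the variables are uncorrelated and unsuited to EFA."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    # Chi-square approximation of the likelihood-ratio statistic
    statistic = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    dof = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, dof)

# Simulated responses: 200 respondents, 4 items sharing one latent factor
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 4))
stat, p_value = bartlett_sphericity(items)
print(p_value < 0.05)  # items are intercorrelated, so EFA is appropriate
```

A significance level of 0.000, as reported above, corresponds to a p value below the smallest printable threshold, so the null hypothesis of an identity correlation matrix is rejected.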

Factor 1 ‘Benefits’ displayed good internal consistency (α = 0.838) and comprised nine items capturing students’ opinions on how the English graduation benchmark policy exerted positive impacts on their learning, such as generating motivation, fostering English proficiency, enhancing competitiveness in future workplaces and education, directing learning and addressing the alarming issues in English proficiency in Vietnam. With good reliability (α = 0.820), factor 2 ‘Anxiety’ comprised eight items referring to students’ perceptions of the anxiety that the policy provoked, for example, placing mounting pressure on learning, increasing worries or exerting negative impacts on their future. Factor 3 ‘Test-oriented learning’ had excellent reliability (α = 0.957) and included four items delineating how students devoted effort to improving test-taking strategies, learning content related to the ELPTs or doing test-oriented practice in the four macro skills. The three factors measuring Vietnamese students’ voices on the policy were all reliable, indicating that the measurements might be applicable to other similar studies. The overall reliability of the 21-item scale was 0.843. Results of the component extraction are shown in Table 2.

Table 2 Principal component analysis for the 21-item questionnaire survey (n = 902)

Confirmatory factor analysis

The 21 items were subsequently validated by calculating the average variance extracted (AVE) and construct reliability (CR), and satisfactory model fit was obtained: root mean square error of approximation (RMSEA = 0.075 < 0.080), Tucker-Lewis Index (TLI = 0.903 > 0.900) and comparative fit index (CFI = 0.910 > 0.900) (Brown, 2006). Table 3 shows the CR and AVE of the three latent variables: ‘Benefits’ (CR = 0.861, AVE = 0.407), ‘Anxiety’ (CR = 0.863, AVE = 0.443) and ‘Test-oriented learning’ (CR = 0.939, AVE = 0.794). These values indicated acceptable validity because CR ≥ 0.60, AVE ≥ 0.40 and high factor loadings (> 0.50) ensure adequate construct validity (Fornell & Larcker, 1981; Gerbing & Anderson, 1988; Fraering & Minor, 2006).

Table 3 Results of confirmatory factor analysis (N = 902)
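CR and AVE are simple functions of the standardised factor loadings. The sketch below uses hypothetical loadings for a four-item factor, chosen for illustration rather than taken from Table 3:

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    return lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardised loadings."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

# Hypothetical standardised loadings for a four-item factor
lams = [0.88, 0.90, 0.89, 0.89]
print(round(composite_reliability(lams), 3))        # 0.938
print(round(average_variance_extracted(lams), 3))   # 0.792
```

These hypothetical loadings yield values close to those reported for ‘Test-oriented learning’, illustrating why a factor with uniformly high loadings clears both the CR ≥ 0.60 and AVE ≥ 0.40 thresholds.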

Group differences in the English graduation benchmark policy

To assess differences in students’ voices among the three regions under study, a one-way ANOVA was conducted. As presented in Table 4, the analysis yielded significant differences in students’ perceptions among the three groups regarding factor 1 ‘Benefits’ (F = 3.301, p = 0.037) and factor 2 ‘Anxiety’ (F = 54.958, p = 0.000), whereas no significant differences were found between NOR, CEN and SOU students with regard to factor 3 ‘Test-oriented learning’ (F = 0.792, p = 0.453). Hochberg’s GT2 post hoc tests were then utilised to examine the differences between pairs of groups because of their appropriateness for unequal sample sizes (Field, 2005).

Table 4 Comparisons between three groups by two factors (ANOVA and Hochberg’s GT2 post hoc test results) ***p < 0.01, **p < 0.05, *p < 0.1
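The omnibus test reported in Table 4 is a standard one-way ANOVA over the three regional groups. A minimal sketch with simulated factor scores; the group sizes and means here are invented, not the study’s:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# Simulated 'Benefits' factor scores for three unequal-sized regional groups
nor = rng.normal(loc=3.6, scale=0.6, size=320)
cen = rng.normal(loc=3.3, scale=0.6, size=250)
sou = rng.normal(loc=3.4, scale=0.6, size=332)
f_stat, p_value = f_oneway(nor, cen, sou)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

Because the three groups differ in size, a post hoc procedure robust to unequal n, such as Hochberg’s GT2, is the appropriate follow-up, as the authors note.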

Regarding factor 1 ‘Benefits’, Hochberg’s GT2 post hoc test results demonstrated that SOU students perceived more benefits from the policy than CEN students (p = 0.056). This finding could be explained by the fact that SOU has witnessed macroeconomic stability and the growth of foreign direct investment; SOU students therefore realised the influence and importance of learning English and obtaining ELPT certificates in order to seize opportunities for securing jobs in multinational and international companies and getting promoted in the workplace (Pan, 2015; Pan & Block, 2011).

The knowledge-based economy demands a solid command of English from SOU students, so the English graduation benchmark policy reflects the importance attached to English: English has become not only a meaningful communication tool but also an essential prerequisite for their future employment. This is also the case in big cities in China, South Korea, Japan and Taiwan, where a good level of English proficiency is an indispensable asset for learning and employment (Tsui & Ngo, 2017). As mentioned earlier, learning English at Vietnamese high schools focuses on memorising grammar rules and practising reading comprehension; thus, using ELPTs as a graduation condition motivates students to strive for better performance and become more responsible for their own learning (Dickinson, 1995; Su, 2005; Pan, 2009; Pan & Newfields, 2011). In the Vietnamese context, the policy provides a stimulus and orients SOU students towards the development of productive skills.

The finding about the disparities between SOU and CEN students’ perceptions of factor 1 ‘Benefits’ is in accordance with findings reported in prior studies (e.g. Kormos & Kiddle, 2013; Lamb, 2012), which confirm that differences in socio-economic situations exert profound impacts on motivation and language learning. SOU students living in urban and more developed contexts are significantly more motivated than CEN students coming from disadvantaged and less developed areas. Coming from low-resource areas where English provision is difficult, CEN students could easily find themselves in a disadvantaged position when required to meet the English graduation benchmark in order to complete their studies. ‘English opens doors, yes, but it closes others. English is an open sesame for some people and some purposes, but it serves to condemn others to poverty and oblivion’ (Skutnabb-Kangas, Phillipson, Panda, & Mohanty, 2009, p. 327).

In terms of factor 2 ‘Anxiety’, Hochberg’s GT2 analysis revealed a hierarchical relationship: scores for NOR students were higher than those for CEN and SOU students (p = 0.041 and p = 0.001), while scores for CEN students were higher than those for SOU students (p = 0.001). Compared to CEN and SOU, NOR is known as the cradle of Vietnamese culture, so NOR students have been more exposed to the pervasive influence of Confucianism. Since their educational success is vital to their future success, NOR students experienced strong pressure to excel academically in their studies, including passing the ELPTs and graduating from HEIs. These expectations and pressures, documented in previous studies contextualised in Singapore (Ang & Huan, 2006; Huan, See, Ang, & Har, 2008; Li, Ang, & Lee, 2008; Luo, Paris, Hogan, & Luo, 2011), may explain why NOR students experience higher levels of anxiety when high-stakes tests are used. Furthermore, job competition in NOR and CEN appears more intense and fiercer than in SOU. This puts NOR and CEN students under undue pressure not only to satisfy the gate-keeping purpose of the ELPTs but also to perform well on these high-stakes tests. NOR and CEN students experience the same stress and pressure as Singaporean and South Korean students to perform well on high-stakes exams because passing these tests plays a crucial role in their future, specifically in pursuing well-paid jobs with high social status (Ang et al., 2009; Kim, 2010; Kim & Kim, 2016). Particularly in NOR, students achieving better ELPT results have more opportunities to find stable jobs in big cities (e.g. Ha Noi and Hai Phong). Undoubtedly, their anxiety levels are heightened (Barksdale-Ladd & Thomas, 2000). Moreover, failing to achieve satisfactory ELPT scores and to graduate from HEIs places NOR students under pressure in learning English, as they are filled with shame and loss of face (Ang et al., 2009; Wong et al., 2005).

Comparing CEN and SOU students, CEN students were found to be less motivated than SOU students but suffered from more anxiety. This aligns closely with previous studies (Hsu, 2004; Liu, 2012) showing that students who are less motivated to learn English tend to experience more anxiety during their learning process.

Regions × English proficiency interaction

Relations between English proficiency levels and students’ voices on the English graduation benchmark policy in NOR, CEN and SOU were assessed via a 3 (English proficiency levels) × 3 (regions) MANOVA. English proficiency levels (i.e. ELE, INT, UPP) and regions (i.e. NOR, CEN, SOU) were the independent variables, and the three factors were the dependent variables. Significant differences were found among the three levels of English proficiency for the three factors (Wilks’ λ = 0.974, F = 15.850, p = 0.000, η2 = 0.013). As displayed in Table 5, the region × English proficiency interaction was significant for factor 1 (F = 2.497, p = 0.082, η2 = 0.001), factor 2 (F = 39.607, p = 0.000, η2 = 0.022) and factor 3 (F = 5.518, p = 0.004, η2 = 0.003). This indicated that students at different English proficiency levels differed in their voices on the English graduation policy. To investigate the interaction for each factor, a series of follow-up comparisons were conducted using the Bonferroni method (Table 6).

Table 5 Tests of between-subjects effects. ***p < 0.01, **p < 0.05, *p < 0.1
Table 6 Pairwise comparisons with a Bonferroni method. ***p < 0.01, **p < 0.05, *p < 0.1
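The Bonferroni follow-ups in Table 6 amount to pairwise comparisons with the significance threshold divided by the number of pairs. A sketch with simulated anxiety scores for the three proficiency levels; the group means and sizes are invented for illustration:

```python
from itertools import combinations

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
# Simulated 'Anxiety' scores by proficiency level within one region
groups = {
    "ELE": rng.normal(loc=3.9, scale=0.6, size=120),
    "INT": rng.normal(loc=3.5, scale=0.6, size=110),
    "UPP": rng.normal(loc=3.3, scale=0.6, size=90),
}
pairs = list(combinations(groups, 2))
alpha = 0.05 / len(pairs)                 # Bonferroni-adjusted threshold
for a, b in pairs:
    t, p = ttest_ind(groups[a], groups[b])
    print(f"{a} vs {b}: p = {p:.4f} ({'significant' if p < alpha else 'n.s.'})")
```

Dividing α by the number of comparisons controls the family-wise error rate, at the cost of some statistical power when many pairs are tested.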

For factor 1 ‘Benefits’, UPP students in SOU gained more motivation from the implementation of the policy than both ELE and INT students (p = 0.001, p = 0.020), whereas no mean differences were found among NOR and CEN students. Contrary to the findings of Wu and Lee (2017), which indicate that lower-proficiency students show more motivation towards the graduation requirement than higher-proficiency students, UPP students in SOU had more motivation and accepted the policy better. Ghaith and Diab (2008) argue that lower-proficiency students are more willing to exert effort in their learning process; on the other hand, higher-proficiency learners are extrinsically motivated because the policy enables them to achieve success in language learning and pass the ELPTs (Chong & Kim, 2001; Dörnyei, 2009). Compared to their lower-proficiency counterparts, they are more engaged in learning activities for their own sake (Lee, 2005). The high-proficiency group is more intrinsically motivated when learning becomes ‘inherently interesting or enjoyable’ (Ryan & Deci, 2000, p. 55), whereas the lower-proficiency group is easily demotivated (Sakai & Kikuchi, 2009).

With regard to factor 2 ‘Anxiety’, the policy was found to provoke higher levels of anxiety in ELE students in NOR and CEN than in their INT and UPP peers (p = 0.000, p = 0.000; p = 0.094, p = 0.000). No mean differences were found between the three levels of English proficiency in SOU. Overall, the findings in NOR and CEN are in accordance with recent findings reported by Wu and Lee (2017): lower-proficiency students (i.e. ELE students in this study) display greater sensitivity to the pressure of satisfying the English graduation benchmark policy and tend to suffer an increased level of anxiety when taking ELPTs. The results corroborate that students suffer a considerable amount of reportedly negative influence, particularly anxiety and stress, from high-stakes tests, as pointed out in previous studies (e.g. Chen, 2012; Chen & Squires, 2010; Damer & Melendres, 2011; Ryan & Weinstein, 2009; Tsai & Tsou, 2009; Woodrow, 2011). As a consequence, they encounter more difficulties in learning English and sustaining interest (Kim, 2009). According to MacIntyre (1999), ‘the combination of high levels of anxiety and low self-rated proficiency creates students with low levels of linguistic self-confidence, which reduces motivation for study and communication in the second language’ (p. 41). INT and UPP students in NOR and CEN were less anxious because anxiety decreases as proficiency increases (Gardner, Smythe, & Brunet, 1977).

Concerning factor 3 ‘Test-oriented learning’, significant differences were found within regions. In NOR, the scores of UPP students were higher than those of ELE and INT students (p = 0.094, p = 0.009), indicating that UPP students engaged in more test-oriented learning than ELE and INT students. INT and UPP students in SOU were also found to engage in more test-oriented learning than ELE students (p = 0.052, p = 0.006). These findings are in line with Ewald’s (2007) study, in which higher-proficiency students were much more influenced by test-oriented learning. Accordingly, those with higher English proficiency (i.e. UPP students in NOR; INT and UPP students in SOU) learn English and perceive the policy as more than a requirement they need to fulfil, whereas lower-proficiency students (i.e. INT and ELE in NOR, ELE in SOU) regard the English graduation benchmark policy as a mere requirement to graduate. As a result, higher-proficiency students are motivated to learn test-taking strategies and skills (Gordon & Reese, 1997).

Conclusions and implications

As the first attempt to draw on a large sample of tertiary students in NOR, CEN and SOU, this study sought to explore students’ voices on the English graduation benchmark policy by analysing their perceptions across three dimensions (i.e. benefits, anxiety and test-oriented learning). Without any doubt, the English graduation benchmark policy, as many studies have shown, helps to improve Vietnamese students’ limited English proficiency. However, the policy also exerts negative impacts on students’ learning, including anxiety and test-oriented learning.

In this research, students in NOR, CEN and SOU showed variations in their perceptions of the policy across the first two factors, factor 1 ‘Benefits’ and factor 2 ‘Anxiety’. Regarding benefits, a major difference was identified between SOU and CEN students. Those living in SOU were more supportive of the policy than CEN students because it generated learning motivation, improved English proficiency, directed learning and enhanced competitiveness in both the future workplace and further education. This reflects the fact that students from developing and urban contexts acknowledge the ELPTs, the policy and English language skills as essential assets for their future. For factor 2 ‘Anxiety’, there were also varying responses from NOR, CEN and SOU students, with NOR first, CEN second and SOU last in levels of anxiety. This variation may be attributed to the heavy influence of Confucianism and socio-economic conditions: NOR and CEN students were under more pressure and anxiety than their SOU counterparts. For students studying in CEN, English proficiency is lower because of socio-economic difficulties and fewer opportunities for English language learning. Their motivation, as stated by Trang and Baldauf Jr (2007), comes from the fact that they are required to learn English. Moreover, seeking stable jobs in CEN cities or provinces (e.g. Da Nang, Hue or Vinh) is highly competitive, placing them under pressure if they do not have a good command of English and do not achieve satisfactory ELPT results for their graduation. Failing to graduate from their HEIs or to find a job imposes a burden on their future. For students in NOR, the level of anxiety caused by the ELPTs and the English graduation benchmark policy is similar to that of CEN students because of their strong economic aspirations and fear of failure.
The research also attempted to explore whether levels of English proficiency created differences in NOR, CEN and SOU students’ voices on the English graduation benchmark policy. The results, interestingly, showed variations among students’ English proficiency levels in each region.

These findings, gathered from both the benefits and the anxiety related to the implementation of the English graduation benchmark policy in Vietnam, emphasise a critical demand for collaboration among test stakeholders, for example, test developers, policy makers, teachers and educational practitioners. Cheng (2005) posits that their engagement and understanding contribute not only to reducing the negative effects, including anxiety, stress, pressure and test-oriented learning, but also to bringing about the intended positive effects. The MOET should conduct more research by gathering opinions from different stakeholders, including HEIs, administrators, teachers, students and employers, to evaluate how the English graduation benchmark policy has been implemented. These efforts would take into account the views of NOR, CEN and SOU students to ensure that the policy of using ELPTs reflects real-life situations (McNamara, 2012; McNamara & Ryan, 2011).

In addition, given the heterogeneous characteristics of the three regions, ranging from cultures and English proficiency to English learning conditions and socio-economic backgrounds, these results further question the effectiveness of this one-size-fits-all policy in Vietnam. If this heterogeneity is not taken into consideration, the one-size-fits-all policy will continually be confronted with inherent problems such as unfairness, test-driven orientation in teaching and learning English, and a growing source of dissatisfaction. As there has been no consistency or consensus in English teaching and learning content at the tertiary level, it might be unfair to determine whether students graduate or remain at the HEI simply on the basis of the scores they gain from the ELPTs. Even though the purpose of adopting ELPTs is to foster English language competence in the future Vietnamese workforce, it may be advisable for the MOET to analyse and adopt different tools to assess the English learning outcomes of students who have lower levels of English proficiency and come from remote and mountainous areas.

Improving the quality of teaching and learning in Vietnam is another possible approach to ensuring the successful implementation of the English graduation benchmark policy. More efforts should be made to narrow the current gaps in English language education among different regions in Vietnam. If Vietnamese students’ overall levels of English proficiency improve significantly, they might become more confident and less anxious in learning English and taking ELPTs.

The quantitative nature of this research might limit the depth of insight into students’ voices on the English graduation benchmark policy; therefore, future studies are recommended to additionally collect qualitative data in the form of semi-structured interviews. In addition, this research gathered participants’ self-reported IELTS and TOEIC results to categorise them into three levels of English proficiency. As IELTS and TOEIC are fundamentally different from each other, future research could use a single ELPT to analyse how students with different proficiency levels perceive the English graduation benchmark policy.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.



Abbreviations

ANOVA: Analysis of variance
AVE: Average variance extracted
CEFR: Common European Framework of Reference for Languages
CEN: Central Vietnam
CFI: Comparative fit index
CR: Construct reliability
EFA: Exploratory factor analysis
ELE: Elementary
ELPT: English language proficiency test
FCE: First Certificate in English
HEI: Higher education institution
IELTS: International English Language Testing System
INT: Intermediate
MANOVA: Multivariate analysis of variance
MOET: Ministry of Education and Training
NFLP: National Foreign Languages Project
NOR: Northern Vietnam
PET: Preliminary English Test
RMSEA: Root mean square error of approximation
SOU: Southern Vietnam
TLI: Tucker-Lewis Index
TOEFL: Test of English as a Foreign Language
TOEIC: Test of English for International Communication
UPP: Upper intermediate
VFLF: Vietnam Foreign Language Framework


  1. Ang, R. P., & Huan, V. S. (2006). Academic expectations stress inventory: Development, factor analysis, reliability, and validity. Educational and Psychological Measurement, 66(3), 522–539.

    Article  Google Scholar 

  2. Ang, R. P., Klassen, R. M., Chong, W. H., Huan, V. S., Wong, I. Y., Yeo, L. S., & Krawchuk, L. L. (2009). Cross-cultural invariance of the academic expectations stress inventory: Adolescent samples from Canada and Singapore. Journal of adolescence, 32(5), 1225–1237.

    Article  Google Scholar 

  3. Anney, V. N. (2014). Ensuring the quality of the findings of qualitative research: Looking at trustworthiness criteria. Journal of Emerging Trends in Educational Research and Policy Studies (JETERAPS), 5(2), 272–281.

    Google Scholar 

  4. Bailey, K. M. (1996). Working for washback: A review of the washback concept in language testing. Language testing, 13(3), 257–279.

    Article  Google Scholar 

  5. Barksdale-Ladd, M. A., & Thomas, K. F. (2000). What’s at stake in high-stakes testing: Teachers and parents speak out. Journal of teacher Education, 51(5), 384–397.

    Article  Google Scholar 

  6. Baurain, B. (2010). Course design and teacher development in Vietnam: A diary project. TESOL Journal, 1(1), 159–175.

    Article  Google Scholar 

  7. Brown, G. A., Bull, J., & Pendlebury, M. (2013). Assessing student learning in higher education. Routledge.

  8. Brown, H. D., & Abeywickrama, P. (2010). Principles of language assessment (pp. 25–51). Language Assessment: Principles and classroom practices.

    Google Scholar 

  9. Brown, T. A. (2006). Confirmatory factor analysis for applied research. New York, NY, US: The Guilford Press.

  10. Bui, T. T. N., & Nguyen, H. T. M. (2016). Standardizing english for educational and socio-economic betterment-a critical analysis of english language policy reforms in Vietnam. In English language education policy in Asia (pp. 363–388). Cham: Springer.

    Google Scholar 

  11. Cavendish, W., Márquez, A., Roberts, M., Suarez, K., & Lima, W. (2017). Student engagement in high-stakes accountability systems. Penn GSE Perspectives on Urban Education, 14(1), n1.

    Google Scholar 

  12. Chapelle, C. A., Enright, M. K., & Jamieson, J. M. (Eds.). (2011). Building a validity argument for the test of English as a Foreign LanguageTM. Routledge.

  13. Chen, H. (2012). The moderating effects of item order arranged by difficulty on the relationship between test anxiety and test performance. Creative Education, 3(03), 328.

    Article  Google Scholar 

  14. Chen, M. L., & Squires, D. (2010). Vocational college students’ perceptions on standardized english proficiency tests. The Asian EFL Journal, 12(2), 68–91.

    Google Scholar 

  15. Chen, S., & Tsai, Y. (2012). Research on English teaching and learning: Taiwan (2004–2009). Language Teaching, 45(2), 180–201.

    Article  Google Scholar 

  16. Chen, S. W., & Liu, G. Z. (2007). The impact of English exit policy on motivation and motivated learning behaviors. In Proceedings of the twenty-fourth international conference on English teaching and learning in the Republic of China (pp. 104–110).

    Google Scholar 

  17. Cheng, L. (2005). Changing language teaching through language testing: A washback study (Vol. 21). Cambridge University Press.

  18. Cheng, L., & Curtis, A. (2012). Test impact and washback: Implications for teaching and learning. Cambridge guide to second language assessment, 89–95.

  19. Cheng, L., & Watanabe, Y. (Eds.). (2004). Washback in language testing: Research contexts and methods. Routledge.

  20. Ching-Ni Hsieh, (2017) The Case of Taiwan: Perceptions of College Students About the Use of the Tests as a Condition of Graduation. ETS Research Report Series (1):1–12.

    Article  Google Scholar 

  21. Chong, D. S., & Kim, H. D. (2001). A study for the development of a university-level general English course. English Teaching, 56(4), 265–292.

    Google Scholar 

  22. Chu, H. Y., & Yeh, H. N. (2017). English benchmark policy for graduation in Taiwan’s higher education: Investigation and reflection. Journal of Language Teaching and Research, 8(6), 1063–1072.

    Article  Google Scholar 

  23. Chu, H. Y. (2009). Stakes, needs and washback: An investigation of the English benchmark policy for graduation and EFL education at two technological universities in Taiwan(Doctoral dissertation, Doctoral dissertation, National Taiwan Normal University). Retrieved from

  24. Churchill, G. A., Jr. (1979). A paradigm for developing better measures of marketing constructs. Journal of marketing research, 16(1), 64–73.

    Article  Google Scholar 

  25. Damer, D. E., & Melendres, L. T. (2011). “Tackling test anxiety”: A group for college students. The Journal for Specialists in Group Work, 36(3), 163–177.

    Article  Google Scholar 

  26. Deci, E. L., & Ryan, R. M. (2016). Optimizing students’ motivation in the era of testing and pressure: A self-determination theory perspective. In Building autonomous learners (pp. 9–29). Singapore: Springer.

    Google Scholar 

  27. Desveaux, S. (2013). Guided learning hours. In Cambridge English Language Assessment (accessed 10 June 2019.

    Google Scholar 

  28. Dickinson, L. (1995). Autonomy and motivation a literature review. System, 23(2), 165–174.

    Article  Google Scholar 

  29. Dörnyei, Z. (2009). The L2 motivational self-system. Motivation, language identity and the L2 self, 36(3), 9–11.

    Article  Google Scholar 

  30. Dudzik, D. L., & Nguyen, Q. T. N. (2015). Vietnam: Building English competency in preparation for ASEAN 2015. ASEAN integration and the role of English language teaching, 41–71.

  31. Ewald, J. D. (2007). Foreign language learning anxiety in upper-level classes: Involving students as researchers. Foreign Language Annals, 40(1), 122–142.

    Article  Google Scholar 

  32. Fraering, M., & Minor, M. S. (2006). Sense of community: An exploratory study of US consumers of financial services. International Journal of Bank Marketing 24(5), 284–306.

  33. Field, A. (2005). Discovering statistics using SPSS for windows. SAGE Publications Ltd, 816.

  34. Fornell, C., & Larker, D. (1981). Structural equation modeling and regression: guidelines for research practice. Journal of Marketing Research, 18(1), 39–50.

  35. Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: An introduction (8th ed.). Boston: Pearson/Allyn & Bacon.

    Google Scholar 

  36. Gan, Z., Humphreys, G., & Hamp-Lyons, L. (2004). Understanding successful and unsuccessful EFL students in Chinese universities. The modern language journal, 88(2), 229–244.

    Article  Google Scholar 

  37. Gardner, R. C., Smythe, P. C., & Brunet, G. R. (1977). Intensive second language study: effects on attitudes, motivation and french achievement 1. Language learning, 27(2), 243–261.

    Article  Google Scholar 

  38. Gerbing, D. W., & Anderson, J. C. (1988). An updated paradigm for scale development incorporating unidimensionality and its assessment. Journal of marketing research, 25(2), 186–192.

    Article  Google Scholar 

  39. Ghaith, G., & Diab, H. (2008). Determinants of EFL achievement among Arab college-bound learners. Education, Business and Society: Contemporary Middle Eastern Issues, 1(4), 278–286.

    Article  Google Scholar 

  40. Gipps, C. (2002). Sociocultural perspectives on assessment. In Learning for life in the 21st century: Sociocultural perspectives on the future of education (pp. 73–83).

    Google Scholar 

  41. Gordon, S. P., & Reese, M. (1997). High-stakes testing: worth the price? Journal of school leadership, 7(4), 345–368.

    Article  Google Scholar 

  42. Hair, J. F., Anderson, R. E., Tatham, R. L., & William, C. (1998). Black (1998). Multivariate data analysis, 5, 87–135.

    Google Scholar 

  43. Hair, J. F., Ringle, C. M., & Sarstedt, M. (2011). PLS-SEM: Indeed a silver bullet. Journal of Marketing theory and Practice, 19(2), 139–152.

    Article  Google Scholar 

  44. Hoa, N. T. T., & Mai, P. T. T. (2016). Difficulties in teaching English for specific purposes: Empirical study at Vietnam universities. Higher Education Studies, 6(2), 154–161.

    Article  Google Scholar 

  45. Hoang, V. V. (2010). The current situation and issues of the teaching of English in Vietnam. Ritsumeikan studies in language and culture, 22(1), 7–18.

    Google Scholar 

  46. Hsu, W. (2009). Measuring the vocabulary of college general English textbooks and English-medium textbooks of business core courses. Electronic Journal of Foreign Language Teaching, 6(2), 126–149.

    Google Scholar 

  47. Hsu, Y. C. (2004). A study on junior college students’ reading anxiety in English as a foreign language. Chiayi, Taiwan: Unpublished master’s thesis, National Chung Cheng University.

    Google Scholar 

  48. Huan, V. S., See, Y. L., Ang, R. P., & Har, C. W. (2008). The impact of adolescent concerns on their academic stress. Educational Review, 60(2), 169–178.

    Article  Google Scholar 

  49. In’nami, Y. (2006). The effects of test anxiety on listening test performance. System, 34(3), 317–340.

    Article  Google Scholar 

  50. Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39(1), 31–36.

    Article  Google Scholar 

  51. Kim, K. J. (2009). Demotivating factors in secondary English education. English Teaching, 64(4), 249–267.

    Article  Google Scholar 

  52. Kim, T. Y. (2010). Socio-political influences on EFL motivation and attitudes: Comparative surveys of Korean high school students. Asia Pacific Education Review, 11(2), 211–222.

    Article  Google Scholar 

  53. Kim, T. Y., & Kim, Y. K. (2016). A quasi-longitudinal study on English learning motivation and attitudes: The case of South Korean students. The Journal of Asia TEFL, 13(2), 138–155.

    Google Scholar 

  54. Kormos, J., & Kiddle, T. (2013). The role of socio-economic factors in motivation to learn English as a foreign language: The case of Chile. System, 41(2), 399–412.

    Article  Google Scholar 

  55. Lamb, M. (2012). A self-system perspective on young adolescents’ motivation to learn English in urban and rural settings. Language learning, 62(4), 997–1023.

    Article  Google Scholar 

  56. Lan, N. T. (2017). Exploring the washback effects of VSTEP on the teaching of English at ULIS-VNU. VNU Journal of Foreign Studies, 33(4), 122–136.

    Article  Google Scholar 

  57. Le Ha, P. (2009). English as an international language: International student and identity formation. Language and intercultural communication, 9(3), 201–214.

    Article  Google Scholar 

  58. Le, V. C. (2017). English language education in Vietnamese universities: National benchmarking in practice. In E. S. Park & B. Spolsky (Eds.), English education at the tertiary level in Asia: From theory to practice (pp. 283–292). New York: Routledge.

    Google Scholar 

  59. Lee, E. (2005). The relationship of motivation and flow experience to academic procrastination in university students. The Journal of Genetic Psychology, 166(1), 5–15.

    Article  Google Scholar 

  60. Li, H., Ang, R. P., & Lee, J. (2008). Anxieties in Mainland Chinese and Singapore Chinese adolescents in comparison with the American norm. Journal of Adolescence, 31(5), 583–594.

    Article  Google Scholar 

  61. Li, W., & Li, Y. (2010). An analysis on social and cultural background of the resistance for China’s education reform and academic pressure. International Education Studies, 3(3), 211–215.

    Google Scholar 

  62. Liauh, Y-H. (2011). A study of the perceptions of English faculty and students of exit English examinations at Taiwan’s technological and vocational higher education institutions. Unpublished doctoral dissertation, University of Montana Missoula, MT.

  63. Liu, H. J. (2012). Understanding EFL undergraduate anxiety in relation to motivation, autonomy, and language proficiency. Electronic Journal of Foreign Language Teaching, 9, 1.

    Google Scholar 

  64. Luo, W., Paris, S. G., Hogan, D., & Luo, Z. (2011). Do performance goals promote learning? A pattern analysis of Singapore students’ achievement goals. Contemporary Educational Psychology, 36(2), 165–176.

  65. MacIntyre, P. D. (1999). Language anxiety: A review of the research for language teachers. Affect in foreign language and second language learning: A practical guide to creating a low-anxiety classroom atmosphere, 24, 41.

  66. McNamara, T. (2012). Language assessments as shibboleths: A poststructuralist perspective. Applied Linguistics, 33(5), 564–581.

  67. McNamara, T., & Ryan, K. (2011). Fairness versus justice in language testing: The place of English literacy in the Australian citizenship test. Language Assessment Quarterly, 8(2), 161–178.

  68. Na, Z. (2007). A study of high school students’ English learning anxiety. The Asian EFL Journal, 9(3), 22–34.

  69. Ngoc, K. M., & Iwashita, N. (2012). A comparison of learners’ and teachers’ attitudes towards communicative language teaching at two universities in Vietnam. University of Sydney Papers in TESOL, 7, 25–49.

  70. Nguyen, H. T. (2018). English-medium-instruction management: The missing piece in the internationalisation puzzle of Vietnamese higher education. In Internationalisation in Vietnamese Higher Education (pp. 119–137). Cham: Springer.

  71. Nguyen, H. T., Fehring, H., & Warren, W. (2015). EFL teaching and learning at a Vietnamese University: What do teachers say? English Language Teaching, 8(1), 31–43.

  72. Nguyen, H. T. M., & Bui, T. (2016). Teachers’ agency and the enactment of educational reform in Vietnam. Current Issues in Language Planning, 17(1), 88–105.

  73. Nguyen, H. T. M., Nguyen, H. T., Van Nguyen, H., & Nguyen, T. T. T. (2018). 12 Local challenges to global needs in English language education in Vietnam: The perspective of language policy and planning. In Un (intended) Language Planning in a Globalising World: Multiple Levels of Players at Work (pp. 214–233). Sciendo Migration.

  74. Nguyen, N. H. (2013). Báo cáo về định hướng công tác thi, kiểm tra, đánh giá tiếng Anh và các môn ngoại ngữ trong hệ thống giáo dục quốc dân giai đoạn 2013–2020 [Report on orientations in testing and assessment of English and other foreign languages in the national education system during 2013–2020]. In Seminar on Strategies of the National Foreign Language Project 2020.

  75. Nhan, T. (2013). The TOEIC® test as an exit requirement in universities and colleges in Danang City, Vietnam: Challenges and impacts. International Journal of Innovative Interdisciplinary Research, 2(1), 33–50.

  76. Pan, L., & Block, D. (2011). English as a “global language” in China: An investigation into learners’ and teachers’ language beliefs. System, 39(3), 391–402.

  77. Pan, Y. (2009). Voices in the field: An interview with Jessica Wu. SHIKEN: The Japan Association of Language Teaching, Testing & Assessment SIG Newsletter, 13, 9–14.

  78. Pan, Y., & Newfields, T. (2011). Teacher and student washback on test preparation evidenced from Taiwan’s English certification exit requirements. International Journal of Pedagogies and Learning, 6(3), 260–272.

  79. Pan, Y. C. (2015). Test impact: English certification exit requirements in Taiwan. TEFLIN Journal, 20(2), 119–139.

  80. Pan, Y. C., & Newfields, T. (2012). Tertiary EFL proficiency graduation requirements in Taiwan: A study of washback on learning. Electronic Journal of Foreign Language Teaching, 9(1).

  81. Phuong, H. Y. (2017). Improving English language teaching in Vietnam: Voices from university teachers and students. Current Politics and Economics of South, Southeastern, and Central Asia, 26(3), 285–310.

  82. Phuong, L. N. T., & Nhu, T. P. (2015). Innovation in English language education in Vietnam for ASEAN 2015 Integration: Current issues, challenges, opportunities, investments and solutions. ASEAN Integration and Role of ELT, 104.

  83. Popham, W. J. (2011). Assessment literacy overlooked: A teacher educator’s confession. The Teacher Educator, 46(4), 265–273.

  84. Popham, W. J. (2014). The right test for the wrong reason. Phi Delta Kappan, 96(1), 46–52.

  85. Roderick, M., & Engel, M. (2001). The grasshopper and the ant: Motivational responses of low-achieving students to high-stakes testing. Educational Evaluation and Policy Analysis, 23(3), 197–227.

  86. Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55(1), 68.

  87. Ryan, R. M., & Weinstein, N. (2009). Undermining quality teaching and learning: A self-determination theory perspective on high-stakes testing. School Field, 7(2), 224–233.

  88. Sakai, H., & Kikuchi, K. (2009). An analysis of demotivators in the EFL classroom. System, 37(1), 57–69.

  89. Shih, C. M. (2007). A new washback model of students’ learning. Canadian Modern Language Review, 64(1), 135–161.

  90. Shih, C. M. (2008). The general English proficiency test. Language Assessment Quarterly, 5(1), 63–76.

  91. Shih, C. M. (2012). Policy analysis of the English graduation benchmark in Taiwan. Perspectives in Education, 30(3), 60–68.

  92. Shih, P. C. (2013). The English Benchmark Policy for Graduation: An Investigation of Perception, Motivation, and Approaches to Learning at a University of Technology in Central Taiwan (Doctoral dissertation, Durham University).

  93. Shohamy, E. (2014). The power of tests: A critical perspective on the uses of language tests. Routledge.

  94. Shohamy, E., Donitsa-Schmidt, S., & Ferman, I. (1996). Test impact revisited: Washback effect over time. Language Testing, 13(3), 298–317.

  95. Skutnabb-Kangas, T., Phillipson, R., Panda, M., & Mohanty, A. (2009). MLE concepts, goals, needs and expense: English for all or achieving justice. Multilingual education for social justice: Globalising the local, 313–334.

  96. Stecher, B. (2010). Performance assessment in an era of standards-based educational accountability. Stanford Center for Opportunity Policy in Education.

  97. Su, M. H. M. (2005). A study of EFL technological and vocational college students’ language learning strategies and their self-perceived English proficiency. Electronic Journal of Foreign Language Teaching, 2(1), 44–56.

  98. Su, Y. C. (2006). EFL teachers’ perceptions of English language policy at the elementary level in Taiwan. Educational Studies, 32(3), 265–283.

  99. ten Holt, J. C., van Duijn, M. A., & Boomsma, A. (2010). Scale construction and evaluation in practice: A review of factor analysis versus item response theory applications. Psychological Test and Assessment Modeling.

  100. Ton, N. N. H., & Pham, H. H. (2010). Vietnamese teachers’ and students’ perceptions of global English. Language Education in Asia, 1(1), 48–61.

  101. Tran, L. T., & Marginson, S. (2018). Internationalisation of Vietnamese higher education: An overview. In Internationalisation in Vietnamese higher education (pp. 1–16). Cham: Springer.

  102. Tran, T. T. (2013). Factors affecting teaching and learning English in Vietnamese universities. The Internet Journal Language, Culture and Society, 38(1), 138–145.

  103. Trang, T. T. T., & Baldauf, R. B., Jr. (2007). Demotivation: Understanding resistance to English language learning – the case of Vietnamese students. The Journal of Asia TEFL, 4(1), 79–105.

  104. Tsagari, D., & Cheng, L. (2016). Washback, impact, and consequences revisited. Language Testing and Assessment, 1–14.

  105. Tsai, Y., & Tsou, C. H. (2009). A standardised English language proficiency test as the graduation benchmark: Student perspectives on its application in higher education. Assessment in Education: Principles, Policy & Practice, 16(3), 319–330.

  106. Tsui, A. P. Y., & Ngo, H. Y. (2017). Students’ perceptions of English-medium instruction in a Hong Kong university. Asian Englishes, 19(1), 57–78.

  107. Van Canh, L., & Barnard, R. (2009). A survey of Vietnamese EAP teacher’s beliefs about grammar teaching. In L. J. Zhang, R. Rubdy, & L. Alsagoff (Eds.), Englishes and Literatures-in-English in a Globalised World: Proceedings of the 13th International Conference on English in Southeast Asia (pp. 246–259). Singapore: National Institute of Education, Nanyang Technological University.

  108. Van Canh, L., & Renandya, W. A. (2017). Teachers’ English proficiency and classroom language use: A conversation analysis study. RELC journal, 48(1), 67–81.

  109. Van Huy, N., & Hamid, M. O. (2015). Educational policy borrowing in a globalized world: A case study of Common European Framework of Reference for languages in a Vietnamese University. English Teaching: Practice & Critique, 14(1), 60–74.

  110. Wall, D. (2013). Washback. In The Routledge handbook of language testing (pp. 93–106). Routledge.

  111. Wang, Y. H., & Liao, H. C. (2012). Anxiety of university students in Taiwan about the General English Proficiency Test. Social Behavior and Personality: an international journal, 40(1), 63–74.

  112. Wong, J., Salili, F., Ho, S. Y., Mak, K. H., Lai, M. K., & Lam, T. H. (2005). The perceptions of adolescents, parents and teachers on the same adolescent health issues. School Psychology International, 26(3), 371–384.

  113. Woodrow, L. (2011). College English writing affect: Self-efficacy and anxiety. System, 39(4), 510–522.

  114. Wu, J., & Lee, M. C. L. (2017). The relationships between test performance and students’ perceptions of learning motivation, test value, and test anxiety in the context of the English benchmark requirement for graduation in Taiwan’s universities. Language Testing in Asia, 7(1), 9.

  115. Wu, J. R. (2012). GEPT and English language teaching and testing in Taiwan. Language Assessment Quarterly, 9(1), 11–25.

  116. Gu, X. (2007). An empirical study of CET washback on College English teaching and learning in China. Journal of Chongqing University (Social Science Edition), 4.

  117. Xie, Q., & Andrews, S. (2013). Do test design and uses influence test preparation? Testing a model of washback with Structural Equation Modeling. Language Testing, 30(1), 49–70.

  118. Zhan, Y., & Andrews, S. (2014). Washback effects from a high-stakes examination on out-of-class English learning: Insights from possible self-theories. Assessment in Education: Principles, Policy & Practice, 21(1), 71–89.


We are grateful to the two anonymous reviewers for their insightful comments, which helped us improve the paper.


No funding was received from any specific funding agency.

Author information




TNP served as the principal investigator, conducting the main study and performing the data analysis. He was a major contributor in writing the manuscript. LTPB participated in the data collection and wrote relevant literature and data analysis. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Thinh Ngoc Pham.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Pham, T.N., Bui, L.T.P. An exploration of students’ voices on the English graduation benchmark policy across Northern, Central and Southern Vietnam. Lang Test Asia 9, 15 (2019).


  • English graduation benchmark policy
  • Washback
  • Language policy
  • English language proficiency tests
  • Teaching and learning English in Vietnam