Open Access

Assessing English speaking skills of prospective teachers at entry and graduation level in teacher education program

Language Testing in Asia 2014, 4:5

https://doi.org/10.1186/2229-0443-4-5

Received: 22 February 2014

Accepted: 17 April 2014

Published: 8 May 2014

Abstract

The development of spoken English in Pakistani public sector teacher training institutions has never been assessed. This study explores the extent of improvement in English speaking skills among prospective teachers of a one year teacher education program at three public sector universities in Punjab, Pakistan, where, as at all other public sector universities in Pakistan, English is the medium of instruction. The sample consisted of 206 prospective teachers (131 at entry level and 75 at graduation level); the unequal numbers reflect a difference in intake. The prospective teachers were called one by one and assessed using the Fairfax County rating scale. The data were analyzed quantitatively. It was concluded that no significant improvement occurs in the English speaking skills of prospective teachers during the teacher education program offered in departments of Education at public sector universities.

Keywords

Assessment, English speaking skills, Teacher education

Introduction

Language is a system of arbitrary symbols for human communication (Knight, 1992). English, being the official language, occupies an important position in Pakistan (Khushi & Talaat, 2011). It is also the language of teacher education in Pakistan (Rahman, 2006). A language has four basic skills: reading, writing, listening and speaking, and learning a language entails learning all four. Learning in Pakistani universities is assessment driven (Ali et al., 2009). Yet instead of all four skills, only writing skills are assessed in public sector educational institutions, including universities, in Pakistan (Alam, 2012; Coleman, 2010). Because assessing writing alone yields high grades, students work hard to master the writing of excellent pieces. English Speaking Skills (ESS) have rarely been assessed, so students pay them little attention. Consequently, the level of English speaking proficiency among outgoing graduates of higher education in Pakistan is very low (Alam, 2012; Bilal et al., 2013; Coleman, 2010; Shahzad et al., 2011). Even bright students who obtain high scores in written examinations, to say nothing of weaker ones, are unable to speak English properly (Bilal et al., 2013; Karim, 2012). Speaking skills empower humans to create new ways of speaking to others about any topic or experience (Honig, 2007; Huang, 2012; Wu, 2012). Speaking fluently and correctly, with proper intonation and pronunciation, especially in a second or foreign language, adds to the credit of the speaker (Lindblad, 2011). Excellence in second language speaking makes the speaker a skilful communicator (Sarwar et al., 2012). Effective communication skill is one of the standards for teachers in many countries of the world (Aslam, 2011; Cammarata, 2010; Government of Pakistan, 2009).

This situation demands research exploring prospective teachers' spoken English at entry and at graduation, and the value added, if any. Surprisingly, no study was available to provide evidence about prospective teachers' English speaking skills at entry and graduation level. The present study is an effort to fill that gap. It explores the entry and graduation levels of English speaking skills of prospective teachers in Pakistan and provides evidence about the value added to English speaking skills during teacher education. This evidence may help teacher educators and administrators in their efforts to improve the English speaking skills of prospective teachers, and may be used to improve entry requirements, curriculum and instruction, and assessment criteria for teacher education programs.

Developing speaking skills

Second language learning follows the same pattern as first language learning: preproduction (the learner only listens), early production (the learner can use short language chunks), speech emergence (the learner tries to initiate short conversations with friends), intermediate fluency, and advanced fluency (the learner is near-native in ability) (Urlaub et al., 2010). Use of the target language to talk about language is the best strategy for learning a spoken language (Maguire et al., 2010). But in Pakistan, external constraints apart, teachers do not attain sufficient oral English proficiency during the teacher education program (Bilal et al., 2013; Karim, 2012; Sarwar et al., 2012; Tariq et al., 2013). Teacher education programs need to be strengthened for effective oral English instruction and assessment (Wedell, 2008).

Assessment of English speaking skills

Assessment is an activity that engages both students and teachers in judgments about the quality of student achievement or performance, and in inferences about the learning that has taken place (Boud & Falchikov, 2006; Sadler, 2005). Second language assessment is done either to gauge a participant's actual level of competence/proficiency or to assess language development over a period of time (Alam, 2012; Bruton, 2009). Assessment has an impact on students' approach to learning: its nature determines the learning behaviour of students as well as the teaching behaviour of teachers. The strong impact of assessment on the language learning process has been noted by many researchers (Crooks, 1988; Heywood, 1989; Newble & Jaeger, 1983). There are many challenges in the assessment of oral skills in a second language, including defining language proficiency, avoiding cultural biases, and attaining validity (Sánchez, 2006). Assessment of speaking skills often lags far behind the importance given to teaching those skills in the curriculum (Knight, 1992). Assessment drives university teaching in Pakistan: during the teaching–learning process, the orientation of both teachers and students remains towards assessment (Ali et al., 2009). The grading system is based only on achievement scores, so teachers, students, administrators and other stakeholders focus only on the areas of the syllabi that bring good credit in terms of examination scores. If assessment is limited to written examinations, students will only learn how to write (Ahmad, 2011; Akiyama, 2003; Ali et al., 2012).

Challenges about assessment of spoken English

The use of oral assessment motivates students to practice and improve their English speaking skills (Huang, 2012; Huxham et al., 2012; Lee, 2007). Despite these benefits, institutions in Pakistan face the problem of finding experts in assessing spoken English (Ahmad, 2011). This situation is mainly due to three reasons: insufficient training, lack of public trust in oral assessment, and issues of test validity. Teachers in Pakistan are not properly trained to conduct oral assessments; they are either reluctant to test oral ability or lack confidence in the validity of their assessments (Knight, 1992). The lack of public trust in oral examinations makes the situation more complex (Bashir, 2011). Validity has been identified as the most important quality of a test; it concerns the extent to which meaningful inferences can be drawn from test scores (Best & Kahn, 2005). Like other tests, tests of spoken skills need to ensure seven qualities: reliability, validity, authenticity, interactiveness, impact, practicality, and absence of bias (Akiyama, 2003; Bilal et al., 2013; Lee, 2007).

The purpose of this study was to assess the development of English speaking skills among prospective teachers undertaking a one year teacher education program in the public sector universities of Punjab, Pakistan.

Methods and procedures

This study is descriptive in nature, as it describes and interprets conditions and relationships that exist (Best & Kahn, 2005). Its purpose was to assess whether development in oral English proficiency takes place among prospective teachers undertaking a one year teacher education program in Punjab, Pakistan, given that the medium of instruction and examination is English but English speaking skills are neither assessed nor given any credit in terms of marks. In other words, the study asked whether prospective teachers improve their proficiency in English speaking skills without the inclusion of any speaking module or course in the syllabus, despite the fact that all courses are taught and assessed in English. Assessment studies include surveys, educational assessment, activity analysis and trend studies (Alam, 2012; Boud, 1990; Heywood, 1989). The present study could be conducted using either a cross-sectional or a longitudinal design. A research design is the plan and structure of investigation, which expresses both the structure of the research problem and the plan used to obtain empirical evidence (Cohen et al., 2011; William, 2009). In a cross-sectional design, data are collected from selected individuals at a single point in time, while a longitudinal design involves multiple measures over an extended period of time (Gay, 2008). Given the limited time for completion of the study, a cross-sectional design was selected. Such a study offers a snapshot of a single moment in time; it does not consider what happens before or after the snapshot is taken (Breakwell et al., 2010). Oral proficiency can be assessed by the use of rubrics (Allen & Tanner, 2006). Rubrics are of two types, holistic and analytic; analytic rubrics are preferred when more accuracy is required (Montgomery, 2002). Most international assessments of speaking skills, including the Common European Framework of Reference for Languages, IELTS and TOEFL, measure speaking proficiency through analytic rubrics.
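The mechanics of analytic scoring can be sketched in a few lines: each criterion gets its own band score and the task score is their sum. The six criteria below are the ones reported in this paper; the 0–4 band range and the example ratings are illustrative assumptions, not the actual Fairfax County scale.

```python
# Minimal sketch of analytic-rubric scoring, for illustration only.
# The criterion names come from the paper; the 0-4 band range and the
# sample ratings are assumptions, not the real Fairfax County bands.
CRITERIA = ["task completion", "comprehensibility", "fluency",
            "pronunciation", "vocabulary", "language control"]

def score_task(ratings: dict) -> int:
    """Sum per-criterion band scores into one analytic task score."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    for criterion, band in ratings.items():
        if not 0 <= band <= 4:
            raise ValueError(f"band out of range for {criterion}: {band}")
    return sum(ratings[c] for c in CRITERIA)

# Hypothetical ratings for one examinee's dialogue task:
dialogue = {c: b for c, b in zip(CRITERIA, [2, 2, 1, 1, 2, 1])}
print(score_task(dialogue))  # 9
```

Summing criterion scores in this way is what makes the per-criterion comparisons in the analysis possible, since each band is retained rather than collapsed into a single holistic judgment.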

Population and Sample

The population was all prospective teachers of the one year teacher training program at public sector universities of Punjab, Pakistan. The accessible population was all prospective teachers at the public sector universities of Punjab in spring 2012. The sample consisted of 206 prospective teachers (131 at entry and 75 at graduation level) from three universities: University of Sargodha, Government College University, Faisalabad, and University of the Punjab, Lahore. The researchers used a multi-stage sampling technique. There were 19 public sector universities in Punjab, which can be subdivided into old and new universities. The researchers selected one university from the older universities (University of the Punjab) and two from the relatively new universities (University of Sargodha and Government College University, Faisalabad). At the second stage, all available prospective teachers of the one year teacher education program were selected from the University of Sargodha and Government College University, Faisalabad. In the case of the University of the Punjab, there were multiple classes of Master of Arts in Education, so two classes (one from each of semester I and semester II) were randomly selected for data collection. The sample consisted predominantly of females, with few males, as only a few males register in Master of Arts in Education at almost all public sector universities in Pakistan.
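The multi-stage procedure described above can be sketched as follows. The university and class names are hypothetical placeholders; only the structure (one older university plus two newer ones, then a random class per semester at the site with multiple classes) follows the paper.

```python
# Sketch of the paper's multi-stage sampling; all names are hypothetical.
import random

random.seed(0)  # fixed seed so the illustration is reproducible

old = ["Univ A (old)", "Univ B (old)"]                  # assumed frames
new = ["Univ C (new)", "Univ D (new)", "Univ E (new)"]

# Stage 1: one older university and two newer ones.
stage1 = random.sample(old, 1) + random.sample(new, 2)

# Stage 2: at the site with multiple classes, randomly pick one class
# per semester (at the other two sites, all students are taken).
classes = {"semester-I": ["I-a", "I-b"], "semester-II": ["II-a", "II-b"]}
stage2 = [random.choice(cls) for cls in classes.values()]
print(stage1, stage2)
```

The design choice here is that randomness enters only where a choice among units exists; where whole intact groups are available (all students at two universities), they are taken in full, which is why the entry and graduation groups end up unequal.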

Data collection tools

A valid and reliable instrument for data collection is essential (Donoghue et al., 2010; Fraenkel & Wallen, 2000). A researcher usually has three options for a data collection tool: develop one, adapt one, or adopt one (Lisboa et al., 2010). The researchers did not pursue the first two options because of monetary, logistic and time constraints. After comparing the available instruments, they decided to use the instrument prepared by Fairfax County. There was no legal or moral hindrance to its use, because the Fairfax County authorities had granted open permission to use their rubrics, for which the researchers are grateful. To test the rubrics in the local Pakistani situation, a pilot study was conducted. Twelve prospective teachers were randomly selected from the Department of Education, University of Sargodha, Pakistan, and were called one by one. The researchers asked each a few questions as a warm-up activity, which was not scored, and then conducted the dialogue. After the dialogue, each prospective teacher was given one minute to note points on a topic of his or her own choice and was informed that he or she would then speak on the topic (independent task, monologue) for two to three minutes, and might be interrupted and asked questions by the assessor. The researchers rated each prospective teacher against the rubrics. The instrument was found appropriate for the study in terms of reliability and validity.

Data collection

Collecting data on speaking skills was not an easy job. Only one student could be assessed at a time, and the average time per student was 10 minutes. Keeping this time constraint in view, the researchers trained three of their students who were doing internships in the Department of English, University of Sargodha. One of the researchers obtained consent from the heads of the three departments of education through preliminary personal visits, during which he also discussed the availability of students and appropriate times for data collection. The data collection team consisted of one of the researchers and the three research assistants trained for the purpose. The team personally visited the three sampled universities and collected data as per schedule. The prospective teachers were called one by one; each was briefed about the data collection process, including the dialogue and monologue, and asked for consent. Only a few declined, mainly due to personal engagements. After the warm-up activity, the dialogue was conducted, followed by the monologue. For the monologue, each prospective teacher was given one minute to note points on a topic of his or her own choice and was informed that he or she would then speak on the topic (independent task) for two to three minutes, and might be interrupted and asked questions by the assessor. The researchers rated each prospective teacher against the rubrics.

Analysis and interpretation of data

The collected data were analysed to compare the oral skills of the prospective teachers at entry and graduation level, overall and on dialogue, monologue, task completion, comprehensibility, fluency, pronunciation, vocabulary and language control, by calculating mean scores, standard deviations and t values.
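The group comparisons can be reproduced from the reported summary statistics alone. As a sketch, the following computes a pooled-variance independent-samples t statistic for the overall "Speaking Skills" row of Table 1; the assumption that a Student's t-test with pooled variance was used is the author's implied method, not explicitly stated, and rounding or direction of subtraction may explain small differences from the tabulated value.

```python
# Sketch: pooled-variance (Student's) t statistic recomputed from the
# summary data in Table 1. Assumes an independent-samples t-test was
# used, which the text implies but does not state explicitly.
import math

def t_from_stats(m1, s1, n1, m2, s2, n2):
    """Independent-samples t with pooled variance; returns (t, df)."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df  # pooled variance
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))           # standard error
    return (m1 - m2) / se, df

# Overall speaking skills, entry vs graduation (Table 1):
t, df = t_from_stats(15.10, 5.62, 131, 14.63, 6.02, 75)
print(round(t, 3), df)  # |t| well below the ~1.97 critical value for
                        # 204 df at alpha = .05, i.e. non-significant,
                        # consistent with the paper's conclusion
```

Running the same computation for the dialogue and monologue rows gives the same picture: every |t| falls short of the critical value, so no entry-versus-graduation difference reaches significance.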

Table 1 shows that there is no significant difference in the English speaking skills of the prospective teachers at entry and graduation level. The prospective teachers at graduation level of the one year teacher education program do not differ significantly from their juniors at entry level in either dialogue or monologue. The table also shows that mean scores for dialogue are better than mean scores for monologue. This trend is worldwide, even when the same content is used to assess examinee performance on a monologue and a dialogue (Giouroglou & Economides, 2004). The findings suggest that items associated with dialogue may be easier for prospective teachers than identical items associated with monologue.
Table 1 Comparison of students regarding English Speaking Skills at Entry and Graduation Level

| Variable        | Level      | N   | Mean  | Std. Dev. | t      | Sig. (2-tailed) |
|-----------------|------------|-----|-------|-----------|--------|-----------------|
| Speaking Skills | Entry      | 131 | 15.10 | 5.62      | -0.569 | 0.51            |
|                 | Graduation | 75  | 14.63 | 6.02      |        |                 |
| Dialogue        | Entry      | 131 | 8.15  | 2.99      | 1.210  | 0.159           |
|                 | Graduation | 75  | 7.54  | 3.00      |        |                 |
| Monologue       | Entry      | 131 | 6.95  | 3.03      | -0.305 | 0.761           |
|                 | Graduation | 75  | 7.09  | 3.26      |        |                 |
Table 2 presents results for the dialogue and Table 3 for the monologue. Both tables show that the prospective teachers do not differ significantly on any of the six criteria of the dialogue and the monologue: task completion, comprehensibility, fluency, pronunciation, vocabulary, and language control. Thus, there is no progress in the English speaking skills of the prospective teachers between entry and graduation level in the one year teacher education program.
Table 2 Comparison of students regarding dialogue at Entry and Graduation Level

| Variable          | Level      | N   | Mean | Std. Dev. | t     | Sig. (2-tailed) |
|-------------------|------------|-----|------|-----------|-------|-----------------|
| Dialogue          | Entry      | 131 | 8.15 | 2.99      | 1.210 | 0.159           |
|                   | Graduation | 75  | 7.54 | 3.00      |       |                 |
| Task completion   | Entry      | 131 | 1.42 | 0.63      | 0.481 | 0.631           |
|                   | Graduation | 75  | 1.38 | 0.55      |       |                 |
| Comprehensibility | Entry      | 131 | 1.42 | 0.61      | 1.006 | 0.315           |
|                   | Graduation | 75  | 1.33 | 0.54      |       |                 |
| Fluency           | Entry      | 131 | 1.35 | 0.61      | 0.627 | 0.532           |
|                   | Graduation | 75  | 1.30 | 0.55      |       |                 |
| Pronunciation     | Entry      | 131 | 1.30 | 0.51      | 1.908 | 0.058           |
|                   | Graduation | 75  | 1.16 | 0.54      |       |                 |
| Vocabulary        | Entry      | 131 | 1.30 | 0.53      | 1.100 | 0.273           |
|                   | Graduation | 75  | 1.22 | 0.52      |       |                 |
| Language Control  | Entry      | 131 | 1.25 | 0.52      | 1.294 | 0.197           |
|                   | Graduation | 75  | 1.15 | 0.52      |       |                 |
Table 3 Comparison of students regarding monologue at Entry and Graduation Level

| Variable          | Level      | N   | Mean | Std. Dev. | t      | Sig. (2-tailed) |
|-------------------|------------|-----|------|-----------|--------|-----------------|
| Monologue         | Entry      | 131 | 6.95 | 3.03      | -0.305 | 0.761           |
|                   | Graduation | 75  | 7.09 | 3.26      |        |                 |
| Task completion   | Entry      | 131 | 1.26 | 0.62      | -0.426 | 0.671           |
|                   | Graduation | 75  | 1.29 | 0.64      |        |                 |
| Comprehensibility | Entry      | 131 | 1.24 | 0.59      | -0.224 | 0.823           |
|                   | Graduation | 75  | 1.26 | 0.62      |        |                 |
| Fluency           | Entry      | 131 | 1.20 | 0.64      | 0.527  | 0.599           |
|                   | Graduation | 75  | 1.15 | 0.65      |        |                 |
| Pronunciation     | Entry      | 131 | 1.12 | 0.50      | -0.147 | 0.883           |
|                   | Graduation | 75  | 1.13 | 0.56      |        |                 |
| Vocabulary        | Entry      | 131 | 1.06 | 0.49      | -1.309 | 0.192           |
|                   | Graduation | 75  | 1.16 | 0.57      |        |                 |
| Language Control  | Entry      | 131 | 1.07 | 0.53      | -0.231 | 0.818           |
|                   | Graduation | 75  | 1.09 | 0.55      |        |                 |

Results and Discussion

The results of this study reveal that there is no improvement in English speaking skills after one complete academic year, despite the fact that the medium of instruction and assessment is English. This may be due to two main reasons: the assessment of writing skills only, and the use of the local language in instruction, as teacher educators and prospective teachers in Pakistan share the same local language. First, any language can be decomposed into four basic skills: reading, writing, listening and speaking; but in actual practice only one of the four, written English, is assessed in Pakistan (Khan, 2013). Assessment has much more influence on students' learning behaviour than teaching does (Gibbs & Simpson, 2004). Students try to find out the hidden curriculum (the skills to be assessed) and, having discovered it, divert their effort towards writing skills only. They prepare for the written examination according to the requirements of the assessment and pay no heed to spoken skills, which are not assessed. From the students' point of view, assessment always defines the actual curriculum, which here consists of writing skills only (Ramsden, 1992). There is worldwide recognition that oral skills need to be assessed (Wilde et al., 2009). This is a reminder to Pakistani universities, where students pass examinations without any improvement in oral skills. The pressure to maximize examination scores shifts the emphasis to those tasks alone that are assessed.

Second, university teacher educators in Pakistan use the Urdu language, ostensibly to accommodate prospective teachers from all language backgrounds, but in practice it seems they do so to cover their own deficiencies (Khan, 2013). It is pertinent to note that there is no aptitude or entry test for admission to teacher education programs in Pakistan; only marks in the previous degree are considered. The results of the study conform to those of other studies in Pakistan but contrast with a study conducted in the United Arab Emirates (Khan, 2013; Rogier & Coombe, 2012). It seems that when teacher educators and prospective teachers share the same local language, improvement in the spoken second language is hindered.

Conclusions

Assessment of Pakistani students' English speaking skills at entry and graduation level revealed no change after one complete academic year. Spoken language needs to be included in the assessment process to improve speaking skills in teacher education programs in Pakistan.

Declarations

Authors’ Affiliations

(1)
Department of Education, University of Sargodha
(2)
Department of English, University of Sargodha

References

  1. Ahmad N: Analyzing the Spoken English Needs in Pakistani Academic Legal Settings. Pakistan Journal of Social Sciences (PJSS) 2011, 31(2):449–469.
  2. Akiyama T: Assessing speaking in Japanese junior high schools: Issues for the senior high school entrance examinations. Shiken: JALT Testing & Evaluation SIG Newsletter 2003, 7(2):2–11.
  3. Alam M: Assessment of Oral Skills Development among the Students of Master in Education in the Public Sector Universities of Punjab. MPhil thesis. Sargodha: University of Sargodha; 2012.
  4. Ali A, Tariq RH, Topping J: Students' perception of university teaching behaviours. Teaching in Higher Education 2009, 14(6):631–647. doi:10.1080/13562510903315159
  5. Ali A, Tariq RH, Topping KJ: Perspectives of academic activities in universities in Pakistan. Journal of Further and Higher Education 2012, 1–28 (ahead of print).
  6. Allen D, Tanner K: Rubrics: Tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE-Life Sciences Education 2006, 5(3):197–203. doi:10.1187/cbe.06-06-0168
  7. Aslam HD: Analyzing Professional Development Practices for Teachers in Public Universities of Pakistan. Mediterranean Journal of Social Sciences 2011, 2(4):97–106.
  8. Bashir M: Factor Effecting Students' English Speaking Skill. British Journal of Arts & Social Sciences 2011, 2(1):34–50.
  9. Best JW, Kahn JV: Research in Education. 10th edition. Singapore: Allyn and Bacon; 2005.
  10. Bilal HA, Rehman A, Rashid A, Adnan R, Abbas M: Problems in Speaking English with L2 Learners of Rural Area Schools of Pakistan. Language in India 2013, 13(10):1220–1235.
  11. Boud D: Assessment and the promotion of academic values. Studies in Higher Education 1990, 15(1):101–111. doi:10.1080/03075079012331377621
  12. Boud D, Falchikov N: Aligning assessment with long-term learning. Assessment & Evaluation in Higher Education 2006, 31(4):399–413. doi:10.1080/02602930600679050
  13. Breakwell GM, Hammond S, Fife-Schaw C: Research Methods in Psychology. London: Sage; 2010.
  14. Bruton A: The Vocabulary Knowledge Scale: A Critical Analysis. Language Assessment Quarterly 2009, 6(4):288–297. doi:10.1080/15434300902801909
  15. Cammarata L: Foreign language teachers' struggle to learn content-based instruction. L2 Journal 2010, 2:1.
  16. Cohen L, Manion L, Morrison K: Research Methods in Education. London: Routledge; 2011.
  17. Coleman H: Teaching and Learning in Pakistan: The Role of Language in Education. Islamabad: British Council; 2010.
  18. Crooks T: The impact of classroom evaluation practices on students. Review of Educational Research 1988, 58(4):438–481. doi:10.3102/00346543058004438
  19. Donoghue A, Nishisaki A, Sutton R, Hales R, Boulet J: Reliability and validity of a scoring instrument for clinical performance during Pediatric Advanced Life Support simulation scenarios. Resuscitation 2010, 81(3):331–336. doi:10.1016/j.resuscitation.2009.11.011
  20. Fraenkel JR, Wallen NE: How to Design and Evaluate Research in Education. London: McGraw-Hill; 2000.
  21. Gay LR: Educational Research: Competencies for Analysis and Application. New Jersey, USA: Prentice Hall; 2008.
  22. Gibbs G, Simpson C: Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education 2004, 1(1):3–31.
  23. Giouroglou H, Economides A: State-of-the-art and adaptive open-closed items in adaptive foreign language assessment. In Proceedings of the 4th Hellenic Conference with International Participation: Informational and Communication Technologies in Education, Vol. A. Athens; 29 September – 03 October 2004:747–756. New Technologies publ. ISBN 960-88359-1-7. http://www.conta.uom.gr/conta/publications/PDF/Adaptive%20FLA%20state%20of%20the%20art.pdf
  24. Government of Pakistan: National Professional Standards for Teachers in Pakistan. Islamabad: Ministry of Education; 2009.
  25. Heywood J: Assessment in Higher Education. New York: John Wiley & Sons; 1989.
  26. Honig AS: Oral language development. Early Child Development and Care 2007, 177(6–7):581–613.
  27. Huang S: Pushing Learners to Work Through Tests & Marks: Motivating or Demotivating? A Case in a Taiwanese University. Language Assessment Quarterly 2012, 9:60–77. doi:10.1080/15434303.2010.510898
  28. Huxham M, Campbell F, Westwood J: Oral versus written assessments: a test of student performance & attitudes. Assessment & Evaluation in Higher Education 2012, 37(1):125–136. doi:10.1080/02602938.2010.515012
  29. Karim A: Global Paradigm Shift in Pedagogy and English Language Teachers Development in Pakistan. International Journal of Asian Social Science 2012, 2(1):64–70.
  30. Khan HI: An investigation of two universities' postgraduate students and their teachers' perceptions of policy and practice of English medium of instruction (EMI) in Pakistani universities. PhD thesis. UK: University of Glasgow; 2013. http://theses.gla.ac.uk/4451/1/2013Khanphd2.pdf
  31. Khushi Q, Talaat M: Evaluation of the English Language Teaching (ELT) Textbooks Taught at the Pakistan Military Academy, Kakul. Language in India 2011, 11(8):75–88.
  32. Knight B: Assessing speaking skills: a workshop for teacher development. ELT Journal 1992, 46(3):294–302. doi:10.1093/elt/46.3.294
  33. Lee Y: The Multimedia Assisted Test of English Speaking: The SOPI Approach. Language Assessment Quarterly 2007, 4(4):352–366. doi:10.1080/15434300701533661
  34. Lindblad M: Communication Strategies in Speaking English as a Foreign Language: in the Swedish 9th Grade National Test Setting. Sweden: University of Gävle; 2011. http://www.diva-portal.org/smash/get/diva2:453878/FULLTEXT01.pdf
  35. Lisboa LB, Garcia VC, Lucrédio D, de Almeida ES, de Lemos Meira SR, de Mattos Fortes RP: A systematic review of domain analysis tools. Information and Software Technology 2010, 52(1):1–13. doi:10.1016/j.infsof.2009.05.001
  36. Maguire MJ, Hirsh-Pasek K, Golinkoff RM, Imai M, Haryu E, Vanegas S, Sanchez-Davis B: A developmental shift from similar to language-specific strategies in verb acquisition: A comparison of English, Spanish, and Japanese. Cognition 2010, 114(3):299–319. doi:10.1016/j.cognition.2009.10.002
  37. Montgomery K: Authentic tasks and rubrics: Going beyond traditional assessments in college teaching. College Teaching 2002, 50(1):34–40. doi:10.1080/87567550209595870
  38. Newble DI, Jaeger K: The effects of assessment & examinations on the learning of medical students. Medical Education 1983, 20:162–175.
  39. Rahman T: Language policy, multilingualism and language vitality in Pakistan. Trends in Linguistics Studies and Monographs 2006, 175:73.
  40. Ramsden P: Learning to Teach in Higher Education. London: Routledge; 1992.
  41. Rogier D, Coombe C: The effects of English-medium instruction on language proficiency of students enrolled in higher education in the UAE. PhD thesis. UK: The University of Exeter; 2012. https://ore.exeter.ac.uk/repository/bitstream/handle/10036/4482/RogierD.pdf?sequence=2
  42. Sadler DR: Interpretations of criteria-based assessment & grading in higher education. Assessment & Evaluation in Higher Education 2005, 30(2):175–194. doi:10.1080/0260293042000264262
  43. Sánchez L: Bilingualism/second-language research and the assessment of oral proficiency in minority bilingual children. Language Assessment Quarterly 2006, 3(2):117–149. doi:10.1207/s15434311laq0302_3
  44. Sarwar M, Shah AA, Alam HM, Hussian S: Usefulness and Liking of English Language as perceived by university students in Pakistan. Archives Des Sciences 2012, 65:3.
  45. Shahzad S, Ali R, Qadeer MZ, Ullah H: Identification of the causes of students' low achievement in the subject of English. Asian Social Science 2011, 7(2):168.
  46. Simon E, Taverniers M: Advanced EFL learners' beliefs about language learning and teaching: a comparison between grammar, pronunciation, and vocabulary. English Studies 2011, 92(8):896–922. doi:10.1080/0013838X.2011.604578
  47. Tariq AR, Bilal HA, Sandhu MA, Iqbal A, Hayat U: Difficulties in Learning English as Second Language in Rural Areas of Pakistan. НОВЫЙ УНИВЕРСИТЕТ 2013, 4(29):24–34.
  48. Urlaub P, VanderHeijden V, Tokudome M, Kayi H, Yasui E: Texas Papers in Foreign Language Education (TPFLE). 2010.
  49. Wedell M: Developing a capacity to make "English for Everyone" worthwhile: Reconsidering outcomes and how to start achieving them. International Journal of Educational Development 2008, 28(6):628–639. doi:10.1016/j.ijedudev.2007.08.002
  50. Wilde J, Kreamelmeyer K, Buckner B: Construction, administration and validation of a written and oral language assessment in an undergraduate teacher education programme. Assessment & Evaluation in Higher Education 2009, 34(5):595–602. doi:10.1080/02602930802255147
  51. William W: Research Methods in Education. India: Pearson Education; 2009.
  52. Wu JR: GEPT and English language teaching and testing in Taiwan. Language Assessment Quarterly 2012, 9(1):11–25. doi:10.1080/15434303.2011.553251

Copyright

© Sarwar et al.; licensee Springer. 2014

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.