Article Type: Research Article
Authors
1 Assistant Professor, Department of Persian Language and Literature Education, Farhangian University, Tehran, Iran.
2 Corresponding Author; Associate Professor, Department of Foreign Languages, International College, Tehran University of Medical Sciences, Iran
Abstract
Assessment plays a pivotal role in second language teaching and learning. Consequently, it is essential that the stakeholders involved in instruction, assessment, and evaluation (teachers, assessment specialists, and researchers) possess adequate knowledge in this domain. Although most instructors teaching Persian to speakers of other languages (TPSOL) complete pre-service teacher training programs, their performance in test-development tasks often remains unsatisfactory. This issue may stem, in part, from the conceptual ambiguity surrounding assessment literacy (AL) in language education. This study examines TPSOL teachers’ deficiencies in AL, their perceived needs in this area, and their views on the practical importance of its various components. To this end, the Assessment Literacy for the Language Classroom questionnaire (Fulcher, 2012) was administered to 45 TPSOL instructors. The findings revealed that the most frequently identified need among participants was scoring the assessment of productive skills (i.e., speaking and writing), while the least prioritized area was knowledge of the history of language testing. In addition, a significant relationship was found between the participants’ academic majors and the emphasis they placed on writing test specifications: instructors with degrees in Teaching English as a Foreign Language (TEFL), Teaching Persian as a Foreign Language (TPFL, also known as AZFA), Linguistics, and Persian Language and Literature, in that order, attached the greatest importance to this component. The study concludes with further findings and pedagogical implications for teacher education and professional development programs.
Extended Abstract
Introduction
Assessment is a central component of language education, shaping both teaching practices and learner outcomes. In the context of teaching Persian to speakers of other languages (TPSOL), the role of teachers in designing, implementing, and interpreting language assessments is crucial. Despite undergoing pre-service teacher training, many instructors lack sufficient competence in language test development. This issue may partly stem from the conceptual ambiguity surrounding assessment literacy in language education. Addressing this gap, the present study explores the needs, perceptions, and challenges of Persian language instructors concerning assessment literacy, aiming to provide insights into how assessment practices can be enhanced in this specialized field. Specifically, this study was designed to investigate the perceived needs of teachers of Persian to speakers of other languages in relation to language assessment literacy, the relationship between teachers’ assessment literacy and their professional experience, and the relationship between assessment literacy components and teachers’ academic background.
Theoretical Framework
The study is grounded in the conceptual framework of language assessment literacy, which encompasses teachers’ theoretical knowledge, practical skills, and ethical awareness regarding assessment. Drawing on the work of scholars such as Fulcher (2012), Popham (2004), and Inbar-Lourie (2013), the study conceptualizes assessment literacy as a multidimensional construct involving technical, pedagogical, and contextual knowledge. Teachers with strong assessment literacy are better equipped to evaluate student performance, make instructional decisions, and communicate results effectively.
Methods
A descriptive survey design was employed using a questionnaire developed by Fulcher (2012), specifically intended to assess language teachers’ assessment literacy. The questionnaire included 23 core items rated on a five-point Likert scale (from "not important at all" to "very important"), along with demographic items on academic degree, field of study, teaching experience, and prior involvement in test development (e.g., the SAMFA exam). The sample consisted of 45 instructors teaching Persian to non-Persian speakers at several universities in Iran, including Tehran University of Medical Sciences, Allameh Tabataba’i University, Imam Khomeini International University, and others. Among the participants, 57.8% held a Master’s degree and 42.2% held a Ph.D. Their academic backgrounds were as follows: Teaching Persian to Speakers of Other Languages (64.4%), Linguistics (20%), Persian Language and Literature (13.3%), and English Language Teaching (TEFL) (2.2%). Their teaching experience ranged from less than one year to more than five years, with the majority (66.7%) having taught for more than five years. Nearly half (48.9%) had more than five years of experience in language test development.
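As an illustrative aside only, and not part of the authors’ procedure, the sketch below shows how responses of this kind could be tabulated, assuming a CSV export of the questionnaire with hypothetical column names (degree, field, years_teaching, and item_01 through item_23 for the 23 Likert items).

```python
# Minimal sketch: tabulating responses to a 23-item, 5-point Likert
# questionnaire on assessment literacy. The file name and column names
# below are hypothetical placeholders, not the study's actual variables.
import pandas as pd

# Each row = one instructor; items are coded 1 ("not important at all")
# to 5 ("very important").
df = pd.read_csv("tpsol_assessment_literacy_survey.csv")
item_cols = [f"item_{i:02d}" for i in range(1, 24)]  # 23 core items

# Sample composition: degree, field of study, teaching experience (in %).
for col in ["degree", "field", "years_teaching"]:
    print(df[col].value_counts(normalize=True).mul(100).round(1))

# Mean importance rating per item, sorted so that the most and least
# prioritized assessment literacy components appear at the extremes.
print(df[item_cols].mean().sort_values(ascending=False).round(2))
```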
Results
Needs Assessment
The most highly rated areas of need included scoring productive skills (speaking and writing), writing assessment tasks and questions, and conducting classroom-based assessment. Conversely, the lowest-rated item was knowledge of the history of language testing, suggesting that teachers prioritized practical assessment skills over theoretical knowledge.
Experience-Based Differences
The Kruskal-Wallis test revealed significant relationships between teachers’ years of teaching experience and several assessment literacy components. Teachers with 1-3 years of experience placed the highest value on test task writing, reliability, classroom assessment, large-scale test development, and test standards; those with more than five years of experience showed moderate emphasis, while teachers with less than one year of experience attached the least importance to these components.
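As a rough sketch of how such a comparison could be computed (not the authors’ actual analysis script; the column names years_teaching and item_reliability are hypothetical), a Kruskal-Wallis H test on one component’s ratings grouped by experience band might look like this:

```python
# Sketch: Kruskal-Wallis H test comparing importance ratings of one
# assessment literacy component (a hypothetical "item_reliability" column)
# across teaching-experience bands ("years_teaching").
import pandas as pd
from scipy.stats import kruskal

df = pd.read_csv("tpsol_assessment_literacy_survey.csv")

# One array of ratings per experience band (e.g., less than one year,
# 1-3 years, more than five years).
groups = [
    ratings.dropna().to_numpy()
    for _, ratings in df.groupby("years_teaching")["item_reliability"]
]

h_stat, p_value = kruskal(*groups)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")  # p < .05 suggests ratings differ by experience
```

The same procedure would apply to the disciplinary comparison reported below, simply by grouping the ratings on an academic-field column instead of the experience variable.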
Disciplinary Background Differences
A significant difference was observed between teachers’ academic fields and their emphasis on test blueprint development. Teachers with degrees in English Language Teaching rated this component highest, followed by those in TPSOL, Linguistics, and Persian Literature.
Discussion
The findings suggest that although the instructors recognized the importance of assessment, their understanding of key concepts such as reliability, test theory, and ethical issues remains limited. While many instructors carry out assessment tasks regularly, their formal training and education in these areas appear insufficient. Furthermore, assessment literacy varies significantly depending on teaching experience and educational background, pointing to the need for more tailored professional development.
Implications
Based on the findings, this study underscores the urgent need to incorporate comprehensive assessment literacy modules into teacher education programs for TPSOL instructors. It is also essential to design in-service training sessions that address both the theoretical foundations and the practical applications of language assessment, and to promote reflective assessment practices that are culturally and pedagogically responsive to the needs of diverse learners. Additionally, policymakers and curriculum designers should consider establishing national standards for assessment literacy to ensure consistency and quality in assessment practices across Persian language programs.
Conclusions
Assessment literacy is not merely a technical skill but a pedagogical competency essential for effective language instruction. As this study reveals, TPSOL instructors, particularly novice and non-specialist teachers, require targeted support in developing robust assessment competencies. Addressing these needs will not only enhance instructional effectiveness but also ensure more equitable and valid evaluation of learner progress.