UTILIZING GRAMMARLY IN EVALUATING ACADEMIC WRITING: A NARRATIVE RESEARCH ON EFL STUDENTS’ EXPERIENCE

Muhamad Nova

Abstract


With the development of technology, any writer can now easily check their academic writing with an automated writing evaluation program. However, using such a program may bring both benefits and drawbacks, so its strengths and weaknesses need to be considered. To fill this need, this study aimed to identify the strengths and weaknesses of Grammarly as an automated writing evaluation program for academic writing. Using narrative inquiry to explore three Indonesian postgraduate students' experiences through interviews and documentation, the study found that the program provides useful color-coded feedback with explanations and examples, easy account access, a high evaluation speed, and a free service for evaluating academic writing. However, several weaknesses were also found in using the program, including occasional misleading feedback, difficulty detecting the variety of English used and the reference list, and a lack of context and content evaluation. Further investigation into the efficiency of the feedback given by Grammarly in improving students' writing quality is needed.


Keywords


academic writing; automated writing evaluation; experience; Grammarly






DOI: http://dx.doi.org/10.24127/pj.v7i1.1332



Copyright (c) 2018 Muhamad Nova

Indexing: DOAJ, IPI, Portal Garuda, SINTA, University Library, Google Scholar