Using large language models to prepare dictation texts
https://doi.org/10.30515/0131-6141-2026-87-2-37-45
Abstract
Large language models (LLMs) are being actively introduced into the educational process; however, their ability to generate high-quality educational texts that meet strict methodological and linguistic requirements remains understudied. This paper aims to evaluate the capabilities of LLMs in creating and editing dictation texts for philology students and to determine the role of teachers in this process. The article describes a linguistic experiment conducted in three stages using four models (DeepSeek, ChatGPT, Alice AI, and GigaChat). In stage one, the models generated texts based on a detailed prompt. In stage two, they corrected the texts according to specific criteria. In stage three, a philologist evaluated the edits against formal, structural, and error-focused criteria; the maximum possible score was 140 points. ChatGPT (up to 99 points) and Alice AI (up to 94 points) were the most successful models in text generation. These models fulfil the formal requirements (length, vocabulary, syntax) well but make factual, speech, and grammatical errors. No model performed all tasks flawlessly during text correction; ChatGPT achieved the best result, +4 points out of 10. Expert editing revealed the need for manual revision to ensure semantic coherence, stylistic clarity, and error elimination. LLMs are thus an effective tool for drafting educational texts, but they cannot guarantee text quality without human intervention. A hybrid model is recommended: text generation by an LLM followed by verification and editing by a philologist. This approach combines AI performance with professional linguistic expertise.
About the Authors
A. M. Dundukova
Russian Federation
Angelina M. Dundukova, Candidate of Sciences (Philology), Senior Lecturer
Petrozavodsk
A. A. Lebedev
Russian Federation
Alexander A. Lebedev, Candidate of Sciences (Philology), Associate Professor
Petrozavodsk
For citations:
Dundukova A.M., Lebedev A.A. Using large language models to prepare dictation texts. Russian language at school. 2026;87(2):37-45. (In Russ.) https://doi.org/10.30515/0131-6141-2026-87-2-37-45