Affiliation(s)

Northwestern Polytechnical University, Xi’an, China

ABSTRACT

Machine translation of low-resource languages (LRLs) has long been hindered by limited corpora and linguistic complexity. This review summarizes key developments, from traditional rule-based and statistical methods to recent progress with large language models (LLMs), while highlighting ongoing challenges such as data bottlenecks, bias and fairness concerns, and computational cost. Finally, it discusses future directions, including parameter-efficient fine-tuning, multimodal translation, and community-driven corpus construction, offering insights for advancing LRL translation research.
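To make the parameter-efficient fine-tuning direction concrete, the sketch below illustrates the core idea behind LoRA, one well-known technique of this kind: a frozen pretrained weight matrix W is augmented with a trainable low-rank update BA, so only a small fraction of the parameters is updated. This is a minimal illustrative sketch, not code from the reviewed work; the matrix dimensions, rank, and scaling values are arbitrary assumptions chosen for demonstration.

```python
import numpy as np

# Sketch of the LoRA idea (low-rank adaptation): the frozen pretrained
# weight W is augmented with a trainable low-rank update B @ A, so the
# adapted weight is W + (alpha / r) * B @ A. All sizes below are
# arbitrary assumptions for illustration.

d_out, d_in = 1024, 1024   # dimensions of one projection matrix (assumed)
r = 8                      # LoRA rank: the key efficiency knob
alpha = 16                 # scaling factor for the low-rank path

rng = np.random.default_rng(0)

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection; zero init
                                        # makes the update start as a no-op

def forward(x: np.ndarray) -> np.ndarray:
    """Adapted forward pass: frozen path plus scaled low-rank path."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
y = forward(x)

full = W.size            # parameters a full fine-tune would update
lora = A.size + B.size   # parameters LoRA actually trains
print(f"trainable params: {lora:,} vs full fine-tuning: {full:,} "
      f"({100 * lora / full:.2f}%)")
```

With rank r = 8 on a 1024×1024 projection, the adapter trains roughly 1.6% of the parameters a full fine-tune would touch, which is why such methods suit LRL settings, where small corpora make full fine-tuning both costly and prone to overfitting.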

KEYWORDS

low-resource languages (LRLs), machine translation, large language models (LLMs)

Cite this paper

Journal of Literature and Art Studies, September 2025, Vol. 15, No. 9, 725-731

