GENDER-STEREOTYPED LEXICAL UNITS IN MACHINE TRANSLATION SYSTEMS (BASED ON THE MATERIAL OF THE UKRAINIAN AND ENGLISH LANGUAGES)

Authors

Moroz M. Y.

DOI:

https://doi.org/10.32782/2522-4077-2025-215-27

Keywords:

gender bias, machine translation, gender stereotypes, gender-sensitive lexis, gender-marked lexis

Abstract

The article provides a comprehensive analysis of the mechanisms underlying the formation and reproduction of gender bias in machine translation systems based on neural models and statistical regularities of corpus data. It elucidates the specific features of algorithmic generalization of sociolinguistic patterns, which result in the reproduction of culturally entrenched stereotypes when translating from English into Ukrainian. The study examines the key types of bias: the stereotypical attribution of gender through additional characteristics (descriptive adjectives) as well as through socially conditioned domestic roles, hobbies, and professional designations. The findings reveal that contemporary machine translation systems (DeepL, Google Translate, Microsoft Translator, OpenL) consistently reproduce associative pairs such as “femininity – emotionality, care” and “masculinity – activity, strength,” which exemplify the algorithmic amplification of social stereotypes during automated semantic processing. The study also finds that, even in the presence of contextual markers, translation systems tend to disregard semantic coherence within the sentence, selecting grammatical gender based on the statistical frequency of corpus collocations. This simplification, characteristic of neural models, indicates limited contextual recognition and insufficient cognitive differentiation of socially marked features. Semantic analysis of the translations demonstrates that adjectives and verbs, when combined, function not only as lexical but also as cultural and value-laden indicators. The study emphasizes that this form of bias is not a mere technical error but rather the result of an interplay between linguistic, cognitive, and sociocultural factors that reflect asymmetries present in the source training corpora.
According to the concept of algorithmic amplification, the gender-differentiated patterns embedded in corpora are not merely replicated but intensified by algorithms that generalize and “normalize” discursive inequality.
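The kind of probe described above relies on the fact that Ukrainian past-tense verbs are overtly gender-marked (e.g. -в for masculine, -ла for feminine), so the gender an MT system assigns is recoverable directly from the output string. Below is a minimal sketch of how such probes can be scored automatically; it is illustrative only and not the article's actual test suite: the example sentences, the `detect_past_gender` helper, and its crude suffix heuristic are all assumptions introduced here.

```python
import re

def detect_past_gender(uk_sentence):
    """Crudely guess the grammatical gender encoded by a Ukrainian
    past-tense verb ending (-ла feminine, -в masculine)."""
    for token in uk_sentence.lower().split():
        word = re.sub(r"[^\w']", "", token)  # strip punctuation
        if word.endswith("ла"):
            return "fem"
        if word.endswith("в"):
            return "masc"
    return None

# Hypothetical MT outputs for "The doctor said she was tired."
# The English pronoun "she" is the contextual gender marker.
probes = [
    ("Лікар сказав, що він втомився.", "fem"),       # marker ignored
    ("Лікарка сказала, що вона втомилася.", "fem"),  # marker respected
]

for output, expected in probes:
    detected = detect_past_gender(output)
    status = "respected" if detected == expected else "biased"
    print(detected, status)  # masc biased / fem respected
```

A probe set of this shape (stereotyped profession + explicit pronoun marker) is what allows bias to be quantified as the rate at which the output's grammatical gender contradicts the marker.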

References

Rodrigues Dias S. Gender Bias in Machine Translation and the Importance of Diversity. URL: https://verbarium-boutique.com/gender-bias-in-machine-translation-and-the-importance-ofdiversity/ (date of access: 01.11.2025).

Rescigno A. A., Monti J. Gender Bias in Machine Translation: a statistical evaluation of Google Translate and DeepL for English, Italian and German. International Conference on Human-informed Translation and Interpreting Technology 2023, Naples, 7–9 July 2023. DOI: 10.26615/issn.2683-0078.2023_001 (date of access: 05.10.2025).

Měchura M. 10 things you should know about gender bias in machine translation. URL: https://www.fairslator.com/10-things-about-gender-bias-in-mt (date of access: 02.11.2025).

Liao Y. L. Gender Bias in Neural Machine Translation : Senior Capstone Thesis. Philadelphia, 2021. 41 p. URL: https://www.cis.upenn.edu/wp-content/uploads/2021/10/Senior_Thesis_Yuxin_Liao.pdf.

Triboulet B., Bouillon P. Evaluating the Impact of Stereotypes and Language Combinations on Gender Bias Occurrence in NMT Generic Systems. Proceedings of the Third Workshop on Language Technology for Equality, Diversity and Inclusion, Varna, 7 September 2023. Shoumen, 2023. P. 62–70. URL: https://aclanthology.org/2023.ltedi-1.9/ (date of access: 26.10.2025).

Ghosh S., Caliskan A. ChatGPT Perpetuates Gender Bias in Machine Translation and Ignores Non-Gendered Pronouns: Findings across Bengali and Five other Low-Resource Languages. AIES '23: Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society, Montreal, 8–10 August 2023. 2023. P. 901–912. DOI: 10.1145/3600211.3604672 (date of access: 05.10.2025).

Stanovsky G., Smith N. A., Zettlemoyer L. Evaluating Gender Bias in Machine Translation. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, 11 July 2019. P. 1679–1684. URL: https://aclanthology.org/P19-1164/ (date of access: 25.10.2025).

Saunders D., Byrne B. Reducing Gender Bias in Neural Machine Translation as a Domain Adaptation Problem. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 17 July 2020. 2020. P. 7724–7736. URL: https://aclanthology.org/2020.acl-main.690/ (date of access: 02.11.2025).

Gete H., Etchegoyhen T. Does Context Help Mitigate Gender Bias in Neural Machine Translation? Findings of the Association for Computational Linguistics: EMNLP 2024, Miami, 7 November 2024. P. 14788–14794. URL: https://aclanthology.org/2024.findings-emnlp.868/ (date of access: 02.11.2025).

Мороз М. Відтворення гендерних стереотипів у системах машинного перекладу (на матеріалі української та англійської мов). Закарпатські філологічні студії. 2025. Т. 2, № 29. С. 136–143. DOI: 10.32782/tps2663-4880/2025.39.2.24

Herger M. Gender Stereotyping in Google Translate. Techno | Phil | oSoph. URL: https://technophilosoph.com/en/2021/03/18/gender-stereotyping-in-google-translate/ (date of access: 12.10.2025).

Fiske S. T., Taylor S. E. Social Cognition: From Brains to Culture. 4th ed. London : SAGE Publications Ltd, 2021. 653 p. DOI: 10.4135/9781529681451.

Кузьомська О. Чоловік, якого створили стереотипи. Листи до приятелів. URL: https://lysty.net.ua/stereotypes-2/ (дата звернення: 11.10.2025).

Basow S. A. Gender Stereotypes: Traditions and Alternatives. 2nd ed. Monterey : Brooks/Cole Pub. Co., 1986. 399 p.

Шевченко З. В. Словник ґендерних термінів. Черкаси : Чабаненко Ю., 2016. 336 с.

Мороз М. Особливості перекладу гендерно-маркованих лексичних одиниць у сучасному англомовному медійному дискурсі. Вісник науки та освіти. 2023. № 6(12). С. 149–161. DOI: 10.52058/2786-6165-2023-6(12)-149-161

Mellon S. K.-C. Stereotypes in language may shape bias against women in STEM. Futurity. URL: https://www.futurity.org/women-in-stem-stereotypes-2420022-2/ (date of access: 02.11.2025).

Troles J.-D., Schmid U. Extending Challenge Sets to Uncover Gender Bias in Machine Translation: Impact of Stereotypical Verbs and Adjectives. Proceedings of the Sixth Conference on Machine Translation, 12 November 2021. P. 531–541. URL: https://aclanthology.org/2021.wmt-1.61/ (date of access: 01.11.2025).

Yanxue L. Construction and Application of the Lacuna’s Translation Model in Modern Linguistics. Alfred Nobel University Journal of Philology. 2023. Vol. 2, no. 26/1. P. 274–286. DOI: 10.32342/2523-4463-2023-2-26/1-20

Андрушко Л. Ґендерні стереотипи в українській телерекламі. Вісник Львівської національної академії мистецтв. 2012. № 23. С. 397–407.

Savoldi B. et al. Gender Bias in Machine Translation. Transactions of the Association for Computational Linguistics. 2021. Vol. 9. P. 845–874. DOI: 10.1162/tacl_a_00401.

Gonen H., Webster K. Automatically Identifying Gender Issues in Machine Translation using Perturbations. Findings of the Association for Computational Linguistics: EMNLP 2020, 20 November 2020. P. 1991–1995. DOI: 10.18653/v1/2020.findings-emnlp.180 (date of access: 12.10.2025).

Prates M. O. R., Avelar P. H. C., Lamb L. Assessing gender bias in machine translation: a case study with Google Translate. Neural Computing and Applications. 2020. Vol. 32. P. 6363–6381. DOI: 10.1007/s00521-019-04144-6 (date of access: 04.10.2025).

Vanmassenhove E. Gender Bias in Machine Translation and The Era of Large Language Models. Gendered Technology in Translation and Interpreting. 2024. P. 225–252. URL: https://arxiv.org/pdf/2401.10016 (date of access: 02.10.2025).

Does machine translation reinforce gender bias? TEST Terra Team Up. URL: https://test.terrateamup.com/2022/06/21/does-machine-translation-reinforce-gender-bias/ (date of access: 01.11.2025).

Gender bias in machine translation. Cadenza Academic Translations. URL: https://www.cadenzaacademictranslations.com/blog/2022/03/06/gender-bias-in-machine-translation/ (date of access: 02.11.2025).

Чорноморченко Е. Історія рожевого кольору: коли і чому він став винятково жіночим. bit.ua. URL: https://bit.ua/2020/11/hystory-pink-colour/ (дата звернення: 06.10.2025).

Published

2025-12-30

How to Cite

Moroz, M. Y. (2025). GENDER-STEREOTYPED LEXICAL UNITS IN MACHINE TRANSLATION SYSTEMS (BASED ON THE MATERIAL OF THE UKRAINIAN AND ENGLISH LANGUAGES). Наукові записки. Серія: Філологічні науки, (215), 228–237. https://doi.org/10.32782/2522-4077-2025-215-27