AEPH — Conferences, Vol. 13, FSSD2025
Gender-Specific Public Perceptions of the Problem of Deepfake Technology and Support for Regulatory Policies under the Influence of Presumed Influence
DOI: https://doi.org/10.62381/ACS.FSSD2025.06
Author(s)
Can Jin
Affiliation(s)
Faculty of Arts and Social Sciences, University of Sydney, Sydney, New South Wales 2050, Australia
Abstract
With the rapid development of deepfake technology, its potential threats to social trust, information authenticity, and personal privacy have attracted growing attention. Baudrillard (1983) observed the increasing transformation of reality through simulation and mediated reproduction, warning that hyperreality could become indistinguishable from lived reality; deepfakes further blur the line between the simulated and the real. Few members of the public understand the generative adversarial networks used to manipulate video, yet media reports of malicious uses of deepfakes abound. Because public knowledge of deepfake technology comes largely from the media and from other people, public attitudes and support for regulatory policies may be shaped by the presumed influence of media content on others. Drawing on the Influence of Presumed Media Influence (IPMI) model, this study explores how members of the public of different genders perceive the impact of deepfake technology, with a focus on its harms, and how this perception affects their support for relevant regulatory policies. Using a questionnaire survey, the study analyzes gender differences in risk perception, presumed media influence, and policy support regarding deepfake technology. The findings show that women are more likely to believe that deepfake-related media content will harm other women or the public at large; this belief heightens their risk perception and sense of social responsibility and, in turn, their support for regulatory policies.
Keywords
Deepfake; Influence of Presumed Media Influence; Gender Differences; Policy Support
References
[1] Westerlund, M. (2019). The Emergence of Deepfake Technology: A Review. Technology Innovation Management Review, 9(11), 39–52. https://timreview.ca/article/1282
[2] Rana, M. S., Nobi, M. N., Murali, B., & Sung, A. H. (2022). Deepfake Detection: A Systematic Literature Review. IEEE Access, 10, 25494–25513. https://doi.org/10.1109/access.2022.3154404
[3] Aldwairi, M., & Alwahedi, A. (2018). Detecting Fake News in Social Media Networks. Procedia Computer Science, 141, 215–222. https://doi.org/10.1016/j.procs.2018.10.171
[4] Chakhoyan, A. (2018, November 16). Deep fakes could destroy democracy. Can they be stopped? World Economic Forum. https://www.weforum.org/stories/2018/11/deep-fakes-may-destroy-democracy-can-they-be-stopped/
[5] Qayyum, A., Qadir, J., Janjua, M. U., & Sher, F. (2019). Using Blockchain to Rein in the New Post-Truth World and Check the Spread of Fake News. IT Professional, 21(4), 16–24. https://doi.org/10.1109/mitp.2019.2910503
[6] Dolhansky, B., Bitton, J., Pflaum, B., Lu, J., Howes, R., Wang, M., & Ferrer, C. C. (2020). The DeepFake Detection Challenge Dataset. arXiv:2006.07397 [cs]. https://arxiv.org/abs/2006.07397
[7] Fedeli, G. (2019). "Fake news" meets tourism: A proposed research agenda. Annals of Tourism Research, 102684. https://doi.org/10.1016/j.annals.2019.02.002
[8] Kwok, A. O. J., & Koh, S. G. M. (2020). Deepfake: A social construction of technology perspective. Current Issues in Tourism, 24(13), 1–5. https://doi.org/10.1080/13683500.2020.1738357
[9] Dunn, S. (2021, March 3). Women, Not Politicians, Are Targeted Most Often by Deepfake Videos. Centre for International Governance Innovation. https://www.cigionline.org/articles/women-not-politicians-are-targeted-most-often-deepfake-videos/?s=03
[10] Wagner, T. L., & Blewer, A. (2019). "The Word Real Is No Longer Real": Deepfakes, Gender, and the Challenges of AI-Altered Video. Open Information Science, 3(1), 32–46. https://doi.org/10.1515/opis-2019-0003
[11] Romero-Delgado, M., & Fernández-Villanueva, M. C. (2024). Narratives, emotions and violence on television: Gender attitudes towards human suffering. SN Social Sciences, 4(1). https://doi.org/10.1007/s43545-023-00804-6
[12] Davidson, D. J., & Freudenburg, W. R. (1996). Gender and environmental risk concerns: A review and analysis of available research. Environment and Behavior, 28(3), 302–339.
[13] Gunther, A. C., & Storey, J. D. (2003). The Influence of Presumed Influence. Journal of Communication, 53(2), 199–215. https://doi.org/10.1111/j.1460-2466.2003.tb02586.x
[14] Tal-Or, N., Cohen, J., Tsfati, Y., & Gunther, A. C. (2010). Testing Causal Direction in the Influence of Presumed Media Influence. Communication Research, 37(6), 801–824. https://doi.org/10.1177/0093650210362684
[15] Donsbach, W., & Traugott, M. W. (2008). The SAGE Handbook of Public Opinion Research. SAGE Publications.
[16] Tsfati, Y., & Cohen, J. (2005). The Influence of Presumed Media Influence on Democratic Legitimacy. Communication Research, 32(6), 794–821. https://doi.org/10.1177/0093650205281057
[17] Rosen, L. D., Whaling, K., Carrier, L. M., Cheever, N. A., & Rokkum, J. (2013). The Media and Technology Usage and Attitudes Scale: An empirical investigation. Computers in Human Behavior, 29(6), 2501–2511. https://doi.org/10.1016/j.chb.2013.06.006
[18] Chadha, A., Kumar, V., Kashyap, S., & Gupta, M. (2021). Deepfake: An Overview. Proceedings of Second International Conference on Computing, Communications, and Cyber-Security, 203, 557–566. https://doi.org/10.1007/978-981-16-0733-2_39
[19] Sjöberg, L. (2003). Risk perception, emotion and policy: The case of nuclear technology. European Review, 11(1), 109–128. https://doi.org/10.1017/s1062798703000127