Algorithmic Bias and Women's Privacy: A Comparative Study of USA, China, and the UK
- DOI
- 10.2991/978-2-38476-515-7_23
- Keywords
- Algorithmic Bias; Digital Surveillance; Digital Privacy; Artificial Intelligence; Privacy Rights
- Abstract
Artificial intelligence, although a cornerstone of the fourth industrial revolution and a benefit to contemporary societies, poses significant challenges regarding algorithmic bias and women's right to privacy. In the United States, AI development is shaped largely by corporate interests, often at the expense of privacy, producing discriminatory outcomes in employment, healthcare, and online advertising. China's state-led AI ecosystem, characterised by large-scale digital surveillance and mass data collection, creates distinct privacy risks for women in a heavily policed cyberspace. The United Kingdom, despite a well-developed regulatory framework such as the GDPR, has not adequately addressed the intersection of AI and women's privacy concerns at multiple levels. This study compares these three jurisdictions, the United States, China, and the United Kingdom, which represent three distinct regulatory models. It highlights the need to enact gender-inclusive AI laws in each jurisdiction to protect women's right to online privacy. The comparative analysis shows how differing socio-political environments shape the application and impact of AI, and demonstrates the necessity of striking a balance that promotes innovation while respecting privacy rights.
- Copyright
- © 2025 The Author(s)
- Open Access
- Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
- Cite this article
TY - CONF
AU - Ishika Gautam
AU - Rajeev Kumar Singh
PY - 2025
DA - 2025/12/26
TI - Algorithmic Bias and Women's Privacy: A Comparative Study of USA, China, and the UK
BT - Proceedings of the International Conference on Law and Technology (ICLT 2025)
PB - Atlantis Press
SP - 253
EP - 262
SN - 2352-5398
UR - https://doi.org/10.2991/978-2-38476-515-7_23
DO - 10.2991/978-2-38476-515-7_23
ID - Gautam2025
ER -