Title 談話型AI如何扭曲與塑形全新的使用者體驗?
How do biases in conversational artificial intelligence distort and shape a new user experience?
Author 舒天宓 (Saquet, Thémis)
Advisor 莊皓鈞 (Chuang, Howard)
Keywords 偏見 (Bias)
對話式人工智能 (Conversational Artificial Intelligence)
對話式營銷 (Conversational Marketing)
用戶體驗 (User Experience)
Date 2023
Uploaded 6-Jul-2023 16:34:53 (UTC+8)
Abstract In our post-COVID-19 societies, more and more consumers rely on conversational AI such as voice assistants or chatbots to perform all kinds of tasks, from asking for the weather to holding personal conversations. Companies have seized on this demand and continue to develop their conversational AI to create an ever-better user experience. However, the racial and gender biases embedded in these AIs distort the original user experience, sometimes creating a new one depending on when the bias is introduced. We analyze the effects of these different biases on the user experience and how they can distort it, particularly depending on the moment at which the biases appear. To do so, we examine the biases of voice assistants and of interactive social chatbots through a case study comparing XiaoIce and Microsoft Tay. We analyze the emergence of these biases and their effects using the three-stage framework for artificial intelligence in marketing. The main conclusions are that racial biases, embedded mainly because of insufficiently diverse data and engineering teams, and gender biases tend to reinforce the structural inequalities that affect our societies. By reinforcing these inequalities, these biases negatively affect the user experience in terms of accessibility, representation, and the experience conveyed by the use of the product.
References
1. De Cosmo, L. (2022) Google Engineer claims AI chatbot is sentient: Why that matters, Scientific American. Scientific American.
Available at: https://www.scientificamerican.com/article/google-engineer-claims-ai-chatbot-is-sentient-why-that-matters/
2. Pourquoi l'IA conversationnelle ? IBM.
Available at: https://www.ibm.com/fr-fr/topics/conversational-ai#:~:text=L`intelligence%20artificielle%20(IA),auxquelles%20les%20utilisateurs%20peuvent%20parler
3. Le fonctionnement des assistants vocaux en 5 étapes (2020) CNIL. Commission nationale de l'informatique et des libertés.
Available at: https://www.cnil.fr/fr/le-fonctionnement-des-assistants-vocaux-en-5-etapes#:~:text=Un%20assistant%20vocal%20est%20un,la%20requ%C3%AAte%20d`un%20utilisateur
4. Qu'est-ce qu'un biais cognitif ? Définition Biais cognitif (2023) USABILIS.
Available at: https://www.usabilis.com/definition-biais-cognitifs/
5. Spradlin, L.K. (2012) Diversity matters: Understanding diversity in schools. Belmont, CA: Wadsworth, Cengage Learning.
6. Braswell, P. (2022) This is the difference between racism and racial bias, Fast Company. Available at: https://www.fastcompany.com/90796690/this-is-the-difference-between-racism-and-racial-bias
7. Eliminating gender bias in conversational AI: Strategies for fair and Inclusive Conversations (2023) Aivo.
Available at: https://www.aivo.co/blog/eliminating-gender-bias-in-conversational-ai-strategies-for-fair-and-inclusive-conversations
8. User experience (2020) What is User Experience? | Definition and Overview.
Available at: https://www.productplan.com/glossary/user-experience/
9. What is user experience (UX) design? (2022) The Interaction Design Foundation. Interaction Design Foundation.
Available at: https://www.interaction-design.org/literature/topics/ux-design
10. Huang, M.-H. and Rust, R.T. (2020) A strategic framework for artificial intelligence in marketing. Journal of the Academy of Marketing Science. Springer US. Available at: https://link.springer.com/article/10.1007/s11747-020-00749-9
11. Edison Research (2020) The Smart Audio Report 2020 from NPR and Edison Research, Edison Research.
Available at: https://www.edisonresearch.com/the-smart-audio-report-2020-from-npr-and-edison-research/
12. Transcript: AI from Above (EP3) (2022) The Internet Health Report 2022.
Available at: https://2022.internethealthreport.org/transcript-ai-from-above-ep3/
13. Harwell, D. (2018) The accent gap: How Amazon's and Google's smart speakers leave certain voices behind, The Washington Post. WP Company.
Available at: https://www.washingtonpost.com/graphics/2018/business/alexa-does-not-understand-your-accent/
14. Sidnell, J. (2019). African American Vernacular English. [online] Hawaii.edu.
Available at: https://www.hawaii.edu/satocenter/langnet/definitions/aave.html.
15. Koenecke, A. et al. (2020) Racial disparities in automated speech recognition. Proceedings of the National Academy of Sciences (PNAS).
Available at: https://www.pnas.org/doi/10.1073/pnas.1915768117
16. Online resources for African American language (no date) CORAAL | Online Resources for African American Language.
Available at: https://oraal.uoregon.edu/coraal
17. Lloreda, C.L. (2020). Speech Recognition Tech Is Yet Another Example of Bias. Scientific American.
Available at: https://www.scientificamerican.com/article/speech-recognition-tech-is-yet-another-example-of-bias/

18. Cantone, J.A., Martinez, L.N., Willis-Esqueda, C. and Miller, T. (2019). Sounding guilty: How accent bias affects juror judgments of culpability. Journal of Ethnicity in Criminal Justice, 17(3), pp.228–253.
doi:https://doi.org/10.1080/15377938.2019.1623963

19. Baquiran, C.L.C. and Nicoladis, E. (2019). A Doctor’s Foreign Accent Affects Perceptions of Competence. Health Communication, 35(6), pp.726–730.
doi:https://doi.org/10.1080/10410236.2019.1584779

20. Baraniuk, C. (2022). Why your voice assistant might be sexist. [online] BBC Future.
Available at: https://www.bbc.com/future/article/20220614-why-your-voice-assistant-might-be-sexist

21. Griggs, B. (2011). Why computer voices are mostly female. [online] CNN.
Available at: https://edition.cnn.com/2011/10/21/tech/innovation/female-computer-voices/

22. Gizmodo. (n.d.). No, Women’s Voices Are Not Easier to Understand Than Men’s Voices. [online]
Available at: https://gizmodo.com/no-siri-is-not-female-because-womens-voices-are-easier-1683901643

23. LEVVVEL (2023). How many copies did Halo sell? — 2023 statistics. [online] levvvel.com.
Available at: https://levvvel.com/halo-statistics/#:~:text=The%20Halo%20series%20has%20sold%2081%20million%20copies.&text=The%20series%20went%20from%2065

24. Franceinfo. (2020). Nouveau monde. Pourquoi les prénoms de Siri et d’Alexa pour des assistants vocaux ? [online]
Available at: https://www.francetvinfo.fr/replay-radio/nouveau-monde/nouveau-monde-pourquoi-les-prenoms-de-siri-et-d-alexa-pour-des-assistants-vocaux_4045705.html

25. Gizmodo. (2015). Why Is My Digital Assistant So Creepy? [online]
Available at: https://gizmodo.com/why-is-my-digital-assistant-so-creepy-1682216423

26. World Economic Forum. (n.d.). Hey Siri, you’re sexist, finds UN report on gendered technology. [online]
Available at: https://www.weforum.org/agenda/2019/05/hey-siri-youre-sexist-finds-u-n-report-on-gendered-technology/

27. IBM (n.d.). Qu’est-ce qu’un agent conversationnel ? [online]
Available at: https://www.ibm.com/fr-fr/topics/chatbots

28–30. Salesforce. (n.d.). State of the Connected Customer Report. [online]
Available at: https://www.salesforce.com/resources/research-reports/state-of-the-connected-customer/.

31. Drift. (2021). 2021 State of Conversational Marketing. [online]
Available at: https://www.drift.com/books-reports/conversational-marketing-trends/#state-of-convo-marketing.

32. Weizenbaum, J. (1966). ELIZA - a computer program for the study of natural language communication between man and machine. Communications of the ACM, [online] 9(1), pp.36–45.
doi:https://doi.org/10.1145/365153.365168.

33. Colby, K.M., Weber, S. and Hilf, F.D. (1971). Artificial Paranoia. Artificial Intelligence, 2(1), pp.1–25.
doi:https://doi.org/10.1016/0004-3702(71)90002-6.

34. Wallace, R.S. (2007). The Anatomy of A.L.I.C.E. Parsing the Turing Test, [online] pp.181–210.
doi:https://doi.org/10.1007/978-1-4020-6710-5_13.

35. Zhou, L., Gao, J., Li, D. and Shum, H.-Y. (2020). The Design and Implementation of XiaoIce, an Empathetic Social Chatbot. Computational Linguistics, 46(1), pp.53–93.
doi:https://doi.org/10.1162/coli_a_00368.

36. Quach, K. (2016). Microsoft chatbots: Sweet XiaoIce vs foul-mouthed Tay. [online] The Register.
Available at: https://www.theregister.com/2016/09/29/microsofts_chatbots_show_cultural_differences_between_the_east_and_west/

37. Microsoft chatbot is taught to swear on Twitter. (2016). BBC News. [online] 24 Mar.
Available at: https://www.bbc.com/news/technology-35890188.

38. Hunt, E. (2016). Tay, Microsoft’s AI chatbot, gets a crash course in racism from Twitter. [online] The Guardian.
Available at: https://www.theguardian.com/technology/2016/mar/24/tay-microsofts-ai-chatbot-gets-a-crash-course-in-racism-from-twitter.

39. Explains, K. (2020). Remembering Microsoft’s Chatbot disaster. [online] Medium.
Available at: https://uxplanet.org/remembering-microsofts-chatbot-disaster-3a49d4a6331f

40. Editor (2016). XiaoIce Vs. Tay: Two A.I. Chatbots, Two Different Outcomes. [online] Sampi.co.
Available at: https://sampi.co/chinese-chatbot-xiaoice-vs-tay/

41–42. Miller, K.W., Wolf, M.J. and Grodzinsky, F.S. (2017). Why We Should Have Seen That Coming. ORBIT Journal, 1(2).
doi:https://doi.org/10.29297/orbit.v1i2.49

43. Leetaru, K. (2016). How Twitter Corrupted Microsoft’s Tay: A Crash Course In the Dangers Of AI In The Real World. [online] Forbes.
Available at: https://www.forbes.com/sites/kalevleetaru/2016/03/24/how-twitter-corrupted-microsofts-tay-a-crash-course-in-the-dangers-of-ai-in-the-real-world/?sh=ffeebf926d28

44. UNESCO (2021). Problème de fille : briser les préjugés dans l’IA. [online] UNESCO.
Available at: https://fr.unesco.org/girltrouble

45. Zendesk. (n.d.). 7 ways to reduce bias in conversational AI. [online]
Available at: https://www.zendesk.tw/blog/7-ways-reduce-bias-conversational-ai/#georedirect

46. SoundHound. (n.d.). How to Overcome Cultural Bias in Voice AI Design. [online]
Available at: https://www.soundhound.com/resources/how-to-overcome-cultural-bias-in-voice-ai-design/
47. Dotan, R. (2023) Example of gender bias in ChatGPT, Ravit Dotan. Available at: https://www.ravitdotan.com/post/example-of-gender-bias-in-chatgpt
48. Shafik and Case (eds.) (2022) Advances in Manufacturing Technology XXXV, IOS Press. Available at: https://www.iospress.com/catalog/books/advances-in-manufacturing-technology-xxxv
49. Kotek, H. (2023) Doctors can’t get pregnant and other gender biases in ChatGPT. Available at: https://hkotek.com/blog/gender-bias-in-chatgpt/

Additional resources

1. Wheeler, D.R. (2014). Why are Cortana and Siri female? [online] CNN.
Available at: https://edition.cnn.com/2014/04/04/opinion/wheeler-cortana-siri/

2. Ohlheiser, A. (2016). Trolls turned Tay, Microsoft’s fun millennial AI bot, into a genocidal maniac. [online] Washington Post.
Available at: https://www.washingtonpost.com/news/the-intersect/wp/2016/03/24/the-internet-turned-tay-microsofts-fun-millennial-ai-bot-into-a-genocidal-maniac/

3. Mathur, V., Stavrakas, Y. and Singh, S. (2016). Intelligence analysis of Tay Twitter bot. [online] ResearchGate.
Available at: https://www.researchgate.net/publication/316727714_Intelligence_analysis_of_Tay_Twitter_bot

4. Nouri, S. (n.d.). Council Post: The Role Of Bias In Artificial Intelligence. [online] Forbes.
Available at: https://www.forbes.com/sites/forbestechcouncil/2021/02/04/the-role-of-bias-in-artificial-intelligence/?sh=3d43f9f7579d

5. UN News. (2019). Are robots sexist? UN report shows gender bias in talking digital tech. [online]
Available at: https://news.un.org/en/story/2019/05/1038691

6. West, M., Kraut, R. and Chew, H.E. (2019). I’d blush if I could: closing gender divides in digital skills through education. UNESCO.
Available at: https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1
Description Master's degree
National Chengchi University
International Master's Program in Business Administration (IMBA)
111933056
Source http://thesis.lib.nccu.edu.tw/record/#G0111933056
URI http://nccur.lib.nccu.edu.tw/handle/140.119/145809
Format application/pdf, 2,705,327 bytes
Type thesis

Table of contents

1 INTRODUCTION 1
2 VOICE ASSISTANTS 8
2.1 RACIAL BIAS IN VOICE ASSISTANTS 9
2.1.1 Marketing Research: How the critical step of data collection may influence the user experience from the very conception of the product 9
2.1.2 Marketing Strategy: How bias becomes a barrier to relevant positioning 14
2.1.3 An empirical measurement of biases to be corrected 16
2.1.4 Marketing Action: How a greater personalization through more diverse data and workforce is the way to go for companies 19
2.2 GENDER BIAS IN VOICE ASSISTANTS 21
2.2.1 Marketing Research and Strategy: How has a persisting historical bias made its way to these critical steps around the user experience? 21
2.2.2 Marketing Action: How better marketing practices can help stop these everlasting stereotypes? 27
3 CHATBOT 29
3.1 CONVERSATIONAL MARKETING 30
3.1.1 The benefits of unbiased conversational marketing 32
3.1.2 What strategies are used for conversational marketing? 39
3.1.3 Creating a bias-free user experience thanks to conversational marketing by leveraging AI in the different stages of marketing 40
3.1.3.1 Marketing Research 40
3.1.3.2 Marketing Strategy 42
3.1.3.3 Marketing Action 43
3.2 THE STUDY CASE OF INTERACTIVE SOCIAL CHATBOTS: HOW LETTING USERS IMPLEMENT THOSE BIASES CAN LEAD TO A TERRIBLE USER EXPERIENCE 44
4 CONCLUSION 52
5 BIBLIOGRAPHY 56
6 ADDITIONAL RESOURCES 61