Title: 談話型AI如何扭曲與塑形全新的使用者體驗? (How do biases in conversational artificial intelligence distort and shape a new user experience?)
Author: 舒天宓 (Saquet, Thémis)
Advisor: 莊皓鈞 (Chuang, Howard)
Keywords: 偏見 (Bias); 對話式人工智能 (Conversational Artificial Intelligence); 對話式營銷 (Conversational Marketing); 用戶體驗 (User Experience)
Date: 2023
Uploaded: 6-Jul-2023 16:34:53 (UTC+8)

Abstract:
In our post-COVID-19 societies, more and more consumers rely on conversational AI such as voice assistants and chatbots for all kinds of tasks, from asking about the weather to holding personal conversations. Companies have seized on this demand and continue to develop their conversational AI in pursuit of an ever-better user experience. However, the racial and gender biases embedded in these systems distort the original user experience, and sometimes create a new one, depending on when the bias is introduced. We analyze the effect of these different biases on the user experience and how they distort it, paying particular attention to the stage at which each bias appears. To do so, we examine biases in voice assistants and in interactive social chatbots through a case study comparing XiaoIce and Microsoft Tay, tracing how these biases arise and what effects they have using the three-stage framework for artificial intelligence in marketing. The main conclusions are that racial biases, embedded mainly through insufficiently diverse data and engineering teams, and gender biases tend to reinforce the structural inequalities that affect our societies. By reinforcing these inequalities, they harm the user experience in terms of accessibility, representation, and the experience conveyed by use of the product.

References:
1. De Cosmo, L. (2022). Google engineer claims AI chatbot is sentient: Why that matters. Scientific American. Available at: https://www.scientificamerican.com/article/google-engineer-claims-ai-chatbot-is-sentient-why-that-matters/
2. IBM (n.d.). Pourquoi l'IA conversationnelle ? [Why conversational AI?]. Available at: https://www.ibm.com/fr-fr/topics/conversational-ai
3. CNIL (2020). Le fonctionnement des assistants vocaux en 5 étapes [How voice assistants work, in 5 steps]. Commission nationale de l'informatique et des libertés. Available at: https://www.cnil.fr/fr/le-fonctionnement-des-assistants-vocaux-en-5-etapes
4. USABILIS (2023). Qu'est-ce qu'un biais cognitif ? Définition [What is a cognitive bias? Definition]. Available at: https://www.usabilis.com/definition-biais-cognitifs/
5. Spradlin, L.K. (2012). Diversity Matters: Understanding Diversity in Schools. Belmont, CA: Wadsworth, Cengage Learning.
6. Braswell, P. (2022). This is the difference between racism and racial bias. Fast Company. Available at: https://www.fastcompany.com/90796690/this-is-the-difference-between-racism-and-racial-bias
7. Aivo (2023). Eliminating gender bias in conversational AI: Strategies for fair and inclusive conversations. Available at: https://www.aivo.co/blog/eliminating-gender-bias-in-conversational-ai-strategies-for-fair-and-inclusive-conversations
8. ProductPlan (2020). What is user experience? Definition and overview. Available at: https://www.productplan.com/glossary/user-experience/
9. Interaction Design Foundation (2022). What is user experience (UX) design? Available at: https://www.interaction-design.org/literature/topics/ux-design
10. Huang, M.-H. and Rust, R.T. (2020). A strategic framework for artificial intelligence in marketing. Journal of the Academy of Marketing Science. Springer US. Available at: https://link.springer.com/article/10.1007/s11747-020-00749-9
11. Edison Research (2020). The Smart Audio Report 2020 from NPR and Edison Research. Available at: https://www.edisonresearch.com/the-smart-audio-report-2020-from-npr-and-edison-research/
12. The Internet Health Report 2022 (2022). Transcript: AI from Above (EP3). Available at: https://2022.internethealthreport.org/transcript-ai-from-above-ep3/
13. Harwell, D. (2018). The accent gap: How Amazon's and Google's smart speakers leave certain voices behind. The Washington Post. Available at: https://www.washingtonpost.com/graphics/2018/business/alexa-does-not-understand-your-accent/
14. Sidnell, J. (2019). African American Vernacular English. Hawaii.edu. Available at: https://www.hawaii.edu/satocenter/langnet/definitions/aave.html
15. Koenecke, A. et al. (2020). Racial disparities in automated speech recognition. Proceedings of the National Academy of Sciences. Available at: https://www.pnas.org/doi/10.1073/pnas.1915768117
16. CORAAL (no date). Online Resources for African American Language. Available at: https://oraal.uoregon.edu/coraal
17. Lloreda, C.L. (2020). Speech recognition tech is yet another example of bias. Scientific American. Available at: https://www.scientificamerican.com/article/speech-recognition-tech-is-yet-another-example-of-bias/
18. Cantone, J.A., Martinez, L.N., Willis-Esqueda, C. and Miller, T. (2019). Sounding guilty: How accent bias affects juror judgments of culpability. Journal of Ethnicity in Criminal Justice, 17(3), pp. 228–253. doi: 10.1080/15377938.2019.1623963
19. Baquiran, C.L.C. and Nicoladis, E. (2019). A doctor's foreign accent affects perceptions of competence. Health Communication, 35(6), pp. 726–730. doi: 10.1080/10410236.2019.1584779
20. Baraniuk, C. (2022). Why your voice assistant might be sexist. BBC Future. Available at: https://www.bbc.com/future/article/20220614-why-your-voice-assistant-might-be-sexist
21. CNN (2011). Why computer voices are mostly female. Available at: https://edition.cnn.com/2011/10/21/tech/innovation/female-computer-voices/
22. Gizmodo (n.d.). No, women's voices are not easier to understand than men's voices. Available at: https://gizmodo.com/no-siri-is-not-female-because-womens-voices-are-easier-1683901643
23. LEVVVEL (2023). How many copies did Halo sell? 2023 statistics. Available at: https://levvvel.com/halo-statistics/
24. Franceinfo (2020). Nouveau monde. Pourquoi les prénoms de Siri et d'Alexa pour des assistants vocaux ? [Why the names Siri and Alexa for voice assistants?]. Available at: https://www.francetvinfo.fr/replay-radio/nouveau-monde/nouveau-monde-pourquoi-les-prenoms-de-siri-et-d-alexa-pour-des-assistants-vocaux_4045705.html
25. Gizmodo (2015). Why is my digital assistant so creepy? Available at: https://gizmodo.com/why-is-my-digital-assistant-so-creepy-1682216423
26. World Economic Forum (n.d.). Hey Siri, you're sexist, finds UN report on gendered technology. Available at: https://www.weforum.org/agenda/2019/05/hey-siri-youre-sexist-finds-u-n-report-on-gendered-technology/
27. IBM (n.d.). Qu'est-ce qu'un agent conversationnel ? [What is a chatbot?]. Available at: https://www.ibm.com/fr-fr/topics/chatbots
28–30. Salesforce (n.d.). State of the Connected Customer Report. Available at: https://www.salesforce.com/resources/research-reports/state-of-the-connected-customer/
31. Drift (2021). 2021 State of Conversational Marketing. Available at: https://www.drift.com/books-reports/conversational-marketing-trends/#state-of-convo-marketing
32. Weizenbaum, J. (1966). ELIZA: a computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1), pp. 36–45. doi: 10.1145/365153.365168
33. Colby, K.M., Weber, S. and Hilf, F.D. (1971). Artificial paranoia. Artificial Intelligence, 2(1), pp. 1–25. doi: 10.1016/0004-3702(71)90002-6
34. Wallace, R.S. (2007). The anatomy of A.L.I.C.E. In: Parsing the Turing Test, pp. 181–210. doi: 10.1007/978-1-4020-6710-5_13
35. Zhou, L., Gao, J., Li, D. and Shum, H.-Y. (2020). The design and implementation of XiaoIce, an empathetic social chatbot. Computational Linguistics, 46(1), pp. 53–93. doi: 10.1162/coli_a_00368
36. Quach, K. (n.d.). Microsoft chatbots: Sweet XiaoIce vs foul-mouthed Tay. The Register. Available at: https://www.theregister.com/2016/09/29/microsofts_chatbots_show_cultural_differences_between_the_east_and_west/
37. BBC News (2016). Microsoft chatbot is taught to swear on Twitter. 24 March. Available at: https://www.bbc.com/news/technology-35890188
38. Hunt, E. (2016). Tay, Microsoft's AI chatbot, gets a crash course in racism from Twitter. The Guardian. Available at: https://www.theguardian.com/technology/2016/mar/24/tay-microsofts-ai-chatbot-gets-a-crash-course-in-racism-from-twitter
39. Explains, K. (2020). Remembering Microsoft's chatbot disaster. UX Planet (Medium). Available at: https://uxplanet.org/remembering-microsofts-chatbot-disaster-3a49d4a6331f
40. Sampi.co (2016). XiaoIce vs. Tay: Two A.I. chatbots, two different outcomes. Available at: https://sampi.co/chinese-chatbot-xiaoice-vs-tay/
41–42. Miller, K.W., Wolf, M.J. and Grodzinsky, F.S. (2017). Why we should have seen that coming. ORBIT Journal, 1(2). doi: 10.29297/orbit.v1i2.49
43. Leetaru, K. (2016). How Twitter corrupted Microsoft's Tay: A crash course in the dangers of AI in the real world. Forbes. Available at: https://www.forbes.com/sites/kalevleetaru/2016/03/24/how-twitter-corrupted-microsofts-tay-a-crash-course-in-the-dangers-of-ai-in-the-real-world/?sh=ffeebf926d28
44. UNESCO (2021). Problème de fille : briser les préjugés dans l'IA [Girl trouble: breaking through the bias in AI]. Available at: https://fr.unesco.org/girltrouble
45. Zendesk (n.d.). 7 ways to reduce bias in conversational AI. Available at: https://www.zendesk.tw/blog/7-ways-reduce-bias-conversational-ai/#georedirect
46. SoundHound (n.d.). How to overcome cultural bias in voice AI design. Available at: https://www.soundhound.com/resources/how-to-overcome-cultural-bias-in-voice-ai-design/
47. Dotan, R. (2023). Example of gender bias in ChatGPT. Available at: https://www.ravitdotan.com/post/example-of-gender-bias-in-chatgpt
48. Shafik and Case (eds.) (2022). Advances in Manufacturing Technology XXXV. IOS Press. Available at: https://www.iospress.com/catalog/books/advances-in-manufacturing-technology-xxxv
49. Kotek, H. (2023). Doctors can't get pregnant and other gender biases in ChatGPT. Available at: https://hkotek.com/blog/gender-bias-in-chatgpt/

Additional resources:
1. Wheeler, D.R. (2014). Why are Cortana and Siri female? CNN. Available at: https://edition.cnn.com/2014/04/04/opinion/wheeler-cortana-siri/
2. Ohlheiser, A. (2016). Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac. The Washington Post. Available at: https://www.washingtonpost.com/news/the-intersect/wp/2016/03/24/the-internet-turned-tay-microsofts-fun-millennial-ai-bot-into-a-genocidal-maniac/
3. Mathur, V., Stavrakas, Y. and Singh, S. (2016). Intelligence analysis of Tay Twitter bot. ResearchGate. Available at: https://www.researchgate.net/publication/316727714_Intelligence_analysis_of_Tay_Twitter_bot
4. Nouri, S. (2021). Council Post: The role of bias in artificial intelligence. Forbes. Available at: https://www.forbes.com/sites/forbestechcouncil/2021/02/04/the-role-of-bias-in-artificial-intelligence/?sh=3d43f9f7579d
5. UN News (2019). Are robots sexist? UN report shows gender bias in talking digital tech. Available at: https://news.un.org/en/story/2019/05/1038691
6. West, M., Kraut, R. and Chew, H.E. (2019). I'd blush if I could: Closing gender divides in digital skills through education. UNESCO. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1

Degree: Master's
Institution: National Chengchi University (國立政治大學)
Program: International Master's Program in Business Administration (IMBA)
Student ID: 111933056
Identifier: G0111933056
Source: http://thesis.lib.nccu.edu.tw/record/#G0111933056
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/145809
Format: application/pdf (2,705,327 bytes)
Type: thesis

Table of contents:
1 INTRODUCTION
2 VOICE ASSISTANTS
2.1 Racial Bias in Voice Assistants
2.1.1 Marketing Research: How the critical step of data collection may influence the user experience from the very conception of the product
2.1.2 Marketing Strategy: How bias becomes a barrier to relevant positioning
2.1.3 An empirical measurement of biases to be corrected
2.1.4 Marketing Action: How greater personalization through more diverse data and workforce is the way to go for companies
2.2 Gender Bias in Voice Assistants
2.2.1 Marketing Research and Strategy: How has a persisting historical bias made its way into these critical steps around the user experience?
2.2.2 Marketing Action: How can better marketing practices help stop these everlasting stereotypes?
3 CHATBOT
3.1 Conversational Marketing
3.1.1 The benefits of unbiased conversational marketing
3.1.2 What strategies are used for conversational marketing?
3.1.3 Creating a bias-free user experience through conversational marketing by leveraging AI in the different stages of marketing
3.1.3.1 Marketing Research
3.1.3.2 Marketing Strategy
3.1.3.3 Marketing Action
3.2 The case study of interactive social chatbots: How letting users implement those biases can lead to a terrible user experience
4 CONCLUSION
5 BIBLIOGRAPHY
6 ADDITIONAL RESOURCES