Academic Output - Theses


Title 大數據經濟發展下企業利用個資方式所面臨的個資法規範分析—以金控公司建立客戶信用分數的剖析模型(Profiling)專案為例
Personal Data Protection in the Era of Big Data — Case Study of a Taiwanese Financial Holding Company’s Customer Profiling Practice
Author 鄭依明
Cheng, Yi-Ming
Contributors 鄭菀瓊
Cheng, Wan-Chiung
鄭依明
Cheng, Yi-Ming
Keywords 人工智慧
數據分析
個資法
個資管理
隱私衝擊評估
個案研究
剖析模型
大數據經濟
AI(artificial intelligence)
Data analysis
Personal data protection
Personal data management
Privacy risk assessment
Case study
Profiling model
Big data
Date 2019
Uploaded 7-Aug-2019 17:05:05 (UTC+8)
Abstract 隨著全球大數據經濟不斷的發展,企業利用手邊數據進行新商業機會開發的行為,逐漸成為受到普遍重視的新商業模式發展策略。尤其是對於面臨著產業轉型需求與傳統產業升級壓力的台灣企業來說,這些企業正嘗試跟隨這波浪潮,不斷的投入公司資源進行數據的利用與研究,以優化舊有服務與打造新服務為目的,盡可能尋找各種能為公司創造價值的數據利用模式。其中,利用大量的個人資料建立客戶剖析模型以創造新商業價值的模式,更是許多企業在近年來成功利用於消費者分析、精準行銷與客製化服務設計等領域之典範,而被視為擁有極大發展潛力的個人資料利用方式。

但企業所追求極大化個人資料價值的利用目的,與個資法中資料隱私權給予資料擁有者控制個人資料的保護目的,兩者是相互矛盾的。針對企業應該如何平衡該矛盾的議題探討,雖然目前相關討論的文獻漸多,但大多數僅以理論面的探討為主,而少有企業內部實際運作的討論,因而難以得知企業實際面對此議題所產生的情境為何。

因此,本文提供了一實際企業進行剖析模型開發案例的介紹,並整理個資法關於個資保護範圍、個資管理模式的現有規範,嘗試分析出企業在個資法的規範下所面臨的問題與困難,並給予未來要利用個人資料進行剖析模型建立的企業一個完整的參考依據。而本研究將藉由與台灣知名金控公司的大數據團隊管理者進行深度訪談,獲取剖析模型建立的實際資料處理情境,並同時進行關於個資保護範圍、企業個資管理模式的文獻探討,最後也會包含結合理論面與實務面綜合分析。

本研究結果發現,由於建立剖析模型的資料處理情境十分複雜且擁有高流程變動性,同時加上有人工智慧特性的演算法參與使得資料處理結果難以預期,造成了企業高昂的管理成本與在資料管理上的限制。而這樣的情形將導致企業在剖析模型建立的過程中容易忽略了對於個人資料的保護,產生許多潛在的隱私侵害來源。因此,企業應在過程中進行隱私衝擊評估,並密切關注隱私衝擊來源的轉換,建立有效率的隱私風險管理模式,以平衡在大數據時代下利用個資行為與個資法保護目的之矛盾。
With the continuous growth of the global big data market, it has become increasingly essential for companies to develop their own strategies for finding new business opportunities by exploiting data. This is especially true for companies in Taiwan, which face pressing demands for industrial transformation and the upgrading of conventional industries; they are trying to follow the overwhelming big data trend by continually investing resources in data exploitation and related research.

Recently, building customer profiling models from large amounts of personal data has come to be regarded as one of the most promising ways to exploit personal data. However, it sets up a conflict between companies' goal of maximizing the value of personal data and the core value of personal data protection law, which gives people the right to control their own personal data. Although research on how to resolve this contradiction keeps growing, most of it focuses on theoretical discussion and rarely provides practical cases from inside companies, so the real impact on a company dealing with this issue remains unexamined.

Accordingly, this thesis presents an actual profiling-model-building scenario to discuss the problem. In addition, it sorts out the relevant provisions of the Personal Data Protection Act, clarifying which kinds of data should be protected and how a company should manage them, and offers comprehensive suggestions for companies that plan to build profiling models with personal data in the future.

To sum up, because the process of building a profiling model is highly complex and changeable, and because the AI algorithms involved make its outputs even harder to predict, companies face heavy managerial costs and tight limits on data management. These conditions lead companies to neglect personal data protection and create many potential sources of privacy violations. A company should therefore conduct privacy impact assessments throughout the process, keep watching how the sources of privacy risk shift, and establish an efficient, adaptive risk-control system, so as to balance the use of personal data against the protective aims of the law in the big data era.
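The privacy risk described above, that profiling work can leave individuals identifiable in the underlying dataset, is often quantified in the de-identification literature with k-anonymity: a table is k-anonymous if every combination of quasi-identifier values is shared by at least k records. The thesis does not prescribe an implementation; the following is only a minimal illustrative sketch, with hypothetical field names:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Size of the smallest group of records sharing the same
    quasi-identifier values; a higher k makes it harder to single
    out any one individual."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(groups.values())

# Toy customer table: age band and postcode act as quasi-identifiers,
# while the credit score is the sensitive attribute being profiled.
customers = [
    {"age_band": "30-39", "postcode": "100", "score": 710},
    {"age_band": "30-39", "postcode": "100", "score": 655},
    {"age_band": "40-49", "postcode": "106", "score": 690},
    {"age_band": "40-49", "postcode": "106", "score": 720},
]

print(k_anonymity(customers, ["age_band", "postcode"]))  # → 2
```

A privacy impact assessment of the kind recommended above could rerun such a check whenever the profiling model's feature set changes, since adding one more quasi-identifier can silently drop k to 1.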
Description Master's thesis
National Chengchi University
Graduate Institute of Technology, Innovation and Intellectual Property Management
105364210
Source http://thesis.lib.nccu.edu.tw/record/#G1053642101
Type thesis
dc.identifier.uri (URI) http://nccur.lib.nccu.edu.tw/handle/140.119/125031-
dc.description.tableofcontents Chapter 1 Introduction 1
Section 1 Research motivation and purpose 1
Section 2 Research methods 3
Item 1 Research scope and methods 3
Item 2 Research plan and chapter organization 6
Chapter 2 The impact of Taiwan's Personal Data Protection Act on companies' use of personal data 8
Section 1 Development and amendment background of Taiwan's Personal Data Protection Act 8
Section 2 What the Act protects: the right to information privacy 12
Section 3 The scope of personal data protection 13
Item 1 Boundaries of indirectly identifiable data 14
Item 2 Boundaries of de-identified data 23
Item 3 Boundaries of linkable data 28
Item 4 Boundaries of data under a reasonable expectation of privacy 32
Item 5 Summary 34
Section 4 Determining whether the Act applies to given data 36
Chapter 3 Corporate models for managing the use of personal data 40
Section 1 The notice-and-consent principle 40
Section 2 De-identification management 43
Section 3 Privacy impact assessment of data-processing workflows 50
Section 4 GDPR rules governing profiling 55
Chapter 4 Case analysis of a company building a profiling model 59
Section 1 Case introduction: a financial holding company's credit-scoring project 59
Item 1 Motivation for building the profiling model 60
Item 2 The model-building process 61
Section 2 Analysis of the Act's applicability to the data 72
Section 3 Privacy impact assessment of the data processing 81
Chapter 5 Conclusion 92
Section 1 Findings 92
Section 2 Limitations and suggestions 97
References 99
Appendix 105
zh_TW
dc.format.extent 3157303 bytes-
dc.format.mimetype application/pdf-
dc.identifier.doi (DOI) 10.6814/NCCU201900050en_US