Title: 機器學習識別古典及量子自旋模型相態 / Identifying phases of classical and quantum spin models with machine learning
Author: Lin, Heng-Yi (林恆毅)
Advisor: Lin, Yu-Cheng (林瑜琤)
Keywords: deep learning; multilayer perceptron; convolutional neural network; triangular quantum Ising antiferromagnet; two-dimensional classical Ising model
Date: 2020
Uploaded: 2-Sep-2020 12:17:00 (UTC+8)

Abstract:
The triangular Ising antiferromagnet has no magnetic order down to zero temperature due to geometrical frustration. Interestingly, a weak transverse field, which introduces quantum fluctuations, can induce magnetic order in the triangular antiferromagnet at zero temperature, resulting in a clock phase with broken Z6 symmetry; this ordered clock phase can be destroyed by a strong transverse field or at finite temperature. At T = 0, a quantum critical point separates the clock phase in weak fields from the paramagnetic phase in strong fields; at finite temperature, the antiferromagnet exhibits an extended Kosterlitz-Thouless (KT) phase intervening between the clock and paramagnetic phases. We generate spin configurations of the triangular antiferromagnet at different temperatures and transverse fields by quantum Monte Carlo (QMC) simulations. We then attempt to classify the phases of the antiferromagnetic system with supervised machine learning, using multilayer perceptrons and convolutional neural networks, based solely on spin configurations sampled with QMC. We find that the neural-network models perform the classification task with 70% accuracy for the triangular quantum antiferromagnet, while successfully distinguishing the classical Ising states with more than 90% accuracy.

References:
[1] G. H. Wannier, Phys. Rev. 79, 357 (1950).
[2] Y. Jiang and T. Emig, Phys. Rev. B 73, 104452 (2006).
[3] S. V. Isakov and R. Moessner, Phys. Rev. B 68 (2003).
[4] M. Žukovič, L. Mižišin, and A. Bobák, Acta Physica Polonica A 126, 40 (2014).
[5] 張鎮宇, Quantum phase transitions of the triangular-lattice Ising antiferromagnet (三角晶格易辛反鐵磁之量子相變), Master's thesis, National Chengchi University, 2017.
[6] A. W. Sandvik and J. Kurkijärvi, Phys. Rev. B 43, 5950 (1991).
[7] R. G. Melko, Stochastic Series Expansion Quantum Monte Carlo, pages 185–206, Springer, Berlin, Heidelberg, 2013.
[8] G. Carleo et al., Rev. Mod. Phys. 91, 045002 (2019).
[9] P. Mehta et al., Physics Reports 810, 1 (2019).
[10] TensorFlow, https://www.tensorflow.org/.
[11] Keras, https://keras.io/.
[12] L. Bottou, Stochastic gradient descent tricks, in Neural Networks: Tricks of the Trade, pages 421–436, Springer, 2012.
[13] D. P. Kingma and J. Ba, arXiv:1412.6980 (2014).
[14] D. E. Rumelhart and D. Zipser, Cognitive Science 9, 75 (1985).
[15] M. A. Nielsen, Neural Networks and Deep Learning, Determination Press, San Francisco, CA, 2015.
[16] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, J. Mach. Learn. Res. 15, 1929 (2014).
[17] S. Ioffe and C. Szegedy, Batch normalization: Accelerating deep network training by reducing internal covariate shift, in Proceedings of the 32nd International Conference on Machine Learning (ICML'15), page 448, JMLR.org, 2015.
[18] S. Santurkar, D. Tsipras, A. Ilyas, and A. Mądry, How does batch normalization help optimization?, in Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS'18), page 2488, Curran Associates Inc., Red Hook, NY, USA, 2018.
[19] P. Mehta and D. J. Schwab, arXiv:1410.3831 (2014).
[20] DeepLearning series: Convolutional Neural Networks, https://mc.ai/deeplearningseriesconvolutionalneuralnetworks/.

Degree: Master's (碩士)
Institution: 國立政治大學 (National Chengchi University), 應用物理研究所 (Graduate Institute of Applied Physics)
Student ID: 106755007
Other identifier: G0106755007
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/131638
DOI: 10.6814/NCCU202001705
Source: http://thesis.lib.nccu.edu.tw/record/#G0106755007
Type: thesis
Format: application/pdf, 4639413 bytes

Table of contents:
Acknowledgments, p. i
Abstract (Chinese), p. iii
Abstract, p. v
Contents, p. vii
1 Introduction, p. 1
2 Spin models, p. 3
2.1 The two-dimensional classical Ising model, p. 3
2.2 The triangular quantum Ising model, p. 4
3 Sampling spin configurations, p. 7
3.1 Sampling methods for the classical Ising model, p. 7
3.2 Sampling methods for the quantum triangular antiferromagnet, p. 10
4 Deep learning on spin configurations, p. 13
4.1 Multilayer perceptrons, p. 15
4.2 Convolutional neural networks, p. 17
4.3 Phase classification for the classical Ising model, p. 19
4.3.1 Learning results of the MLP models, p. 20
4.3.2 Learning results of the CNN models, p. 22
4.4 Phase classification for the quantum triangular antiferromagnet, p. 23
5 Conclusions, p. 29
References, p. 31
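The sampling stage summarized in the abstract can be illustrated for the simpler of the two models, the 2D classical Ising model; the quantum triangular antiferromagnet itself requires stochastic series expansion QMC [6, 7]. The sketch below is not the thesis code: the function name, lattice size, temperature, and sweep counts are all illustrative choices, and a single-spin-flip Metropolis update stands in for whatever sampler the thesis actually used.

```python
import numpy as np

def sample_ising_configs(L=16, T=2.0, n_sweeps=200, n_configs=10, seed=0):
    """Metropolis sampling of the 2D classical Ising model on an L x L
    square lattice with periodic boundaries (J = 1, k_B = 1).
    Returns an array of shape (n_configs, L, L) with spins in {-1, +1}."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    configs = []
    for sweep in range(n_sweeps * n_configs):
        for _ in range(L * L):  # one Monte Carlo sweep = L*L attempted flips
            i, j = rng.integers(0, L, size=2)
            # energy change of flipping spin (i, j): dE = 2 J s_ij * (sum of neighbors)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] = -spins[i, j]
        if (sweep + 1) % n_sweeps == 0:  # keep one configuration per n_sweeps
            configs.append(spins.copy())
    return np.array(configs)
```

Configurations drawn this way at temperatures below and above the critical point T_c = 2/ln(1 + sqrt(2)) ≈ 2.269 (in units of J/k_B) would serve as labeled training data for the phase classifiers.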
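The classification stage can likewise be sketched in miniature. The thesis trains multilayer perceptrons and convolutional neural networks with TensorFlow/Keras [10, 11] on QMC configurations; the self-contained NumPy sketch below instead trains a one-hidden-layer perceptron by full-batch gradient descent on synthetic stand-in data (aligned spins with a few flips for "ordered", independent random spins for "disordered"), so every name, shape, and hyperparameter here is an illustrative assumption rather than the thesis setup.

```python
import numpy as np

def train_phase_classifier(n_train=400, L=8, hidden=16, epochs=2000, lr=0.5, seed=0):
    """One-hidden-layer perceptron (tanh hidden units, sigmoid output)
    trained with full-batch gradient descent to label L x L spin
    configurations as ordered (1) or disordered (0).
    Returns (predict_fn, training_accuracy)."""
    rng = np.random.default_rng(seed)
    X = np.empty((n_train, L * L)); y = np.empty(n_train)
    for k in range(n_train):
        if rng.random() < 0.5:                      # ordered: aligned spins, 5% flipped
            s = np.full(L * L, rng.choice([-1.0, 1.0]))
            flip = rng.random(L * L) < 0.05
            s[flip] = -s[flip]
            y[k] = 1.0
        else:                                       # disordered: independent random spins
            s = rng.choice([-1.0, 1.0], size=L * L)
            y[k] = 0.0
        X[k] = s

    W1 = rng.normal(0.0, 0.125, (L * L, hidden)); b1 = rng.normal(0.0, 0.1, hidden)
    W2 = rng.normal(0.0, 0.1, hidden); b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                    # hidden activations
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output
        g = (p - y) / n_train                       # d(cross-entropy)/d(logit)
        gh = np.outer(g, W2) * (1.0 - h ** 2)       # backprop through tanh
        W2 -= lr * (h.T @ g); b2 -= lr * g.sum()
        W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)

    def predict(cfgs):
        h = np.tanh(np.asarray(cfgs).reshape(len(cfgs), -1) @ W1 + b1)
        return (1.0 / (1.0 + np.exp(-(h @ W2 + b2))) > 0.5).astype(int)

    acc = float((predict(X) == y).mean())
    return predict, acc
```

Because the ordered class appears with both magnetization signs, a hidden layer is genuinely needed (a single linear unit cannot separate large |m| from small |m|); on this easy synthetic task the training accuracy typically approaches 1, whereas the thesis reports that comparable networks reach only about 70% on the quantum triangular antiferromagnet.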