Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/131638
DC Field | Value | Language
dc.contributor.advisor | 林瑜琤 | zh_TW
dc.contributor.advisor | Lin, Yu-Cheng | en_US
dc.contributor.author | 林恆毅 | zh_TW
dc.contributor.author | Lin, Heng-Yi | en_US
dc.creator | 林恆毅 | zh_TW
dc.creator | Lin, Heng-Yi | en_US
dc.date | 2020 | en_US
dc.date.accessioned | 2020-09-02T04:17:00Z | -
dc.date.available | 2020-09-02T04:17:00Z | -
dc.date.issued | 2020-09-02T04:17:00Z | -
dc.identifier | G0106755007 | en_US
dc.identifier.uri | http://nccur.lib.nccu.edu.tw/handle/140.119/131638 | -
dc.description | 碩士 (Master's degree) | zh_TW
dc.description | 國立政治大學 (National Chengchi University) | zh_TW
dc.description | 應用物理研究所 (Graduate Institute of Applied Physics) | zh_TW
dc.description | 106755007 | zh_TW
dc.description.abstract | 三角易辛(Ising)反鐵磁在絕對零度因幾何挫折性而不具磁性。有趣的是,具量子效應的橫向磁場可誘發易辛反鐵磁零溫基態之有序性,產生具 Z6 對稱破缺的時鐘態;這個零溫有序態可由更強的橫向磁場或有限溫度破壞。在絕對零度,一量子臨界點區分弱場下的有序時鐘態與強場下的無序順磁態。而在有限溫度,一 Kosterlitz-Thouless 相態區隔了低溫的時鐘態及高溫的順磁態。我們以量子蒙地卡羅方法針對許多不同溫度值及橫場值產生自旋組態,接著藉機器學習技術的多層感知器和捲積神經網路訓練機器辨識自旋組態與相態的關係,再以更多的自旋組態使神經網路識別其對應的相態。上述機器學習方法可頗精確辨識古典易辛模型的簡單相態,但對我們主要考慮的三角反鐵磁相態卻無法呈現良好的辨識力。 | zh_TW
dc.description.abstract | The triangular Ising antiferromagnet has no magnetic order down to zero temperature due to geometrical frustration. Interestingly, a weak transverse field, introducing quantum fluctuations, can induce magnetic order in the triangular antiferromagnet at zero temperature, resulting in a clock phase with broken Z6 symmetry; this ordered clock phase can be destroyed by a strong transverse field or by thermal fluctuations. At T=0, a quantum critical point separates the clock phase in weak fields from the paramagnetic phase in strong fields; at finite temperature, the antiferromagnet exhibits an extended Kosterlitz-Thouless (KT) phase between the clock and paramagnetic phases. We generate spin configurations of the triangular antiferromagnet at various temperatures and transverse fields by quantum Monte Carlo (QMC) simulations, and we train supervised machine-learning models, multilayer perceptrons (MLPs) and convolutional neural networks (CNNs), to classify the phases of the system based solely on the sampled spin configurations. We find that the neural networks reach only about 70% classification accuracy for the triangular quantum antiferromagnet, while they distinguish the phases of the classical Ising model with more than 90% accuracy. | en_US
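The abstract describes training phase classifiers on Monte Carlo spin configurations. As a minimal, hypothetical sketch of the kind of training data involved for the classical case (the thesis's actual sampling code is not reproduced in this record), the following generates 2D classical ferromagnetic Ising configurations with single-spin-flip Metropolis updates; the lattice size, temperatures, sweep count, and ordered initial state are illustrative choices, not the thesis's settings:

```python
import math
import random

def metropolis_ising(L=16, T=2.0, sweeps=200, seed=0):
    """Sample one L x L classical Ising configuration (J = 1, periodic
    boundaries) with single-spin-flip Metropolis updates.  Starting from
    the fully ordered state keeps the low-temperature run in the ordered
    phase without long equilibration."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest neighbours with periodic wrap-around.
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1
    return spins

def magnetization(spins):
    """Absolute magnetization per spin, a simple order parameter."""
    n = len(spins) * len(spins[0])
    return abs(sum(map(sum, spins))) / n

# Labeled examples for a phase classifier: ordered (T below Tc ~ 2.269)
# versus disordered (T above Tc).
cold = metropolis_ising(T=1.0)
hot = metropolis_ising(T=4.0)
print(magnetization(cold), magnetization(hot))
```

A supervised classifier such as an MLP or CNN would then be trained on many such configurations labeled by phase; near the critical temperature the magnetization alone becomes ambiguous, which is where learned features are expected to help.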
dc.description.tableofcontents (zh_TW):
致謝 (Acknowledgements)
摘要 (Chinese abstract)
Abstract
Contents
1 緒論 (Introduction)
2 自旋模型 (Spin models)
2.1 二維古典易辛模型 (The two-dimensional classical Ising model)
2.2 三角量子易辛模型 (The triangular quantum Ising model)
3 自旋組態採樣 (Sampling spin configurations)
3.1 古典易辛模型採樣方法 (Sampling method for the classical Ising model)
3.2 量子三角反鐵磁模型採樣方法 (Sampling method for the quantum triangular antiferromagnet)
4 深度學習自旋組態 (Deep learning of spin configurations)
4.1 多層感知器 (Multilayer perceptron)
4.2 捲積神經網路 (Convolutional neural network)
4.3 古典易辛模型相態的分類 (Classifying phases of the classical Ising model)
4.3.1 MLP 模型的學習結果 (Learning results of the MLP model)
4.3.2 CNN 模型的學習結果 (Learning results of the CNN model)
4.4 量子三角反鐵磁相態的分類 (Classifying phases of the quantum triangular antiferromagnet)
5 結論 (Conclusion)
參考文獻 (References)
dc.format.extent | 4639413 bytes | -
dc.format.mimetype | application/pdf | -
dc.source.uri | http://thesis.lib.nccu.edu.tw/record/#G0106755007 | en_US
dc.subject | 深度學習 | zh_TW
dc.subject | 多層感知器 | zh_TW
dc.subject | 捲積神經網路 | zh_TW
dc.subject | 三角量子反鐵磁 | zh_TW
dc.subject | 二維古典易辛模型 | zh_TW
dc.subject | deep learning | en_US
dc.subject | multilayer perceptron | en_US
dc.subject | convolutional neural network | en_US
dc.subject | triangular quantum Ising antiferromagnet | en_US
dc.subject | two-dimensional classical Ising model | en_US
dc.title | 機器學習識別古典及量子自旋模型相態 | zh_TW
dc.title | Identifying phases of classical and quantum spin models with machine learning | en_US
dc.type | thesis | en_US
dc.relation.reference (zh_TW):
[1] G. H. Wannier, Phys. Rev. 79, 357 (1950).
[2] Y. Jiang and T. Emig, Phys. Rev. B 73, 104452 (2006).
[3] S. V. Isakov and R. Moessner, Phys. Rev. B 68 (2003).
[4] M. Žukovič, L. Mižišin, and A. Bobák, Acta Physica Polonica A 126, 40 (2014).
[5] 張鎮宇, 三角晶格易辛反鐵磁之量子相變, Master's thesis, 國立政治大學, 2017.
[6] A. W. Sandvik and J. Kurkijärvi, Phys. Rev. B 43, 5950 (1991).
[7] R. G. Melko, Stochastic Series Expansion Quantum Monte Carlo, pages 185–206, Springer, Berlin, Heidelberg, 2013.
[8] G. Carleo et al., Rev. Mod. Phys. 91, 045002 (2019).
[9] P. Mehta et al., Physics Reports 810, 1 (2019).
[10] TensorFlow, https://www.tensorflow.org/.
[11] Keras, https://keras.io/.
[12] L. Bottou, Stochastic gradient descent tricks, in Neural Networks: Tricks of the Trade, pages 421–436, Springer, 2012.
[13] D. P. Kingma and J. Ba, arXiv:1412.6980 (2014).
[14] D. E. Rumelhart and D. Zipser, Cognitive Science 9, 75 (1985).
[15] M. A. Nielsen, Neural Networks and Deep Learning, Determination Press, San Francisco, CA, 2015.
[16] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, J. Mach. Learn. Res. 15, 1929 (2014).
[17] S. Ioffe and C. Szegedy, Batch normalization: Accelerating deep network training by reducing internal covariate shift, in Proceedings of the 32nd International Conference on Machine Learning (ICML'15), page 448, JMLR.org, 2015.
[18] S. Santurkar, D. Tsipras, A. Ilyas, and A. Mądry, How does batch normalization help optimization?, in Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS'18), page 2488, Curran Associates Inc., Red Hook, NY, USA, 2018.
[19] P. Mehta and D. J. Schwab, arXiv:1410.3831 (2014).
[20] DeepLearning series: Convolutional Neural Networks, https://mc.ai/deeplearningseriesconvolutionalneuralnetworks/.
dc.identifier.doi | 10.6814/NCCU202001705 | en_US
item.grantfulltext | restricted | -
item.openairetype | thesis | -
item.openairecristype | http://purl.org/coar/resource_type/c_46ec | -
item.cerifentitytype | Publications | -
item.fulltext | With Fulltext | -
Appears in Collections: 學位論文 (Theses)
Files in This Item:
File | Description | Size | Format
500701.pdf | | 4.53 MB | Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.