Academic Output - Periodical Articles

Title: Complexity of neural networks on Fibonacci-Cayley tree
Authors: 班榮超 (Ban, Jung-Chao); Chang, Chih-Hung
Contributor: 應數系
Keywords: Neural networks; Learning problem; Cayley tree; Separation property; Entropy
Date: 2019-05
Uploaded: 28-Apr-2020 13:54:34 (UTC+8)
Abstract: This paper investigates the coloring problem on the Fibonacci-Cayley tree, which is a Cayley graph whose vertex set is the Fibonacci sequence. More precisely, we elucidate the complexity of shifts of finite type defined on the Fibonacci-Cayley tree via an invariant called entropy. We demonstrate that computing the entropy of a Fibonacci tree-shift of finite type is equivalent to studying a nonlinear recursive system, and we present an algorithm for the computation. Moreover, the entropy of a Fibonacci tree-shift of finite type is the logarithm of the spectral radius of its corresponding matrix. We apply the result to neural networks defined on the Fibonacci-Cayley tree, which reflect neural systems with neuronal dysfunction. Aside from demonstrating the surprising phenomenon that there are only two possible values of entropy for neural networks on the Fibonacci-Cayley tree, we derive the formula for the boundary in the parameter space.
Relation: Journal of Algebra Combinatorics Discrete Structures and Applications, Vol. 6, No. 2, pp. 105-122
Type: article
dc.contributor 應數系
dc.creator (Author) 班榮超
dc.creator (Author) Ban, Jung-Chao
dc.creator (Author) Chang, Chih-Hung
dc.date (Date) 2019-05
dc.date.accessioned 28-Apr-2020 13:54:34 (UTC+8)
dc.date.available 28-Apr-2020 13:54:34 (UTC+8)
dc.date.issued (Upload time) 28-Apr-2020 13:54:34 (UTC+8)
dc.identifier.uri (URI) http://nccur.lib.nccu.edu.tw/handle/140.119/129555
dc.description.abstract (Abstract) This paper investigates the coloring problem on the Fibonacci-Cayley tree, which is a Cayley graph whose vertex set is the Fibonacci sequence. More precisely, we elucidate the complexity of shifts of finite type defined on the Fibonacci-Cayley tree via an invariant called entropy. We demonstrate that computing the entropy of a Fibonacci tree-shift of finite type is equivalent to studying a nonlinear recursive system, and we present an algorithm for the computation. Moreover, the entropy of a Fibonacci tree-shift of finite type is the logarithm of the spectral radius of its corresponding matrix. We apply the result to neural networks defined on the Fibonacci-Cayley tree, which reflect neural systems with neuronal dysfunction. Aside from demonstrating the surprising phenomenon that there are only two possible values of entropy for neural networks on the Fibonacci-Cayley tree, we derive the formula for the boundary in the parameter space.
dc.format.extent 304665 bytes
dc.format.mimetype application/pdf
dc.relation (Relation) Journal of Algebra Combinatorics Discrete Structures and Applications, Vol. 6, No. 2, pp. 105-122
dc.subject (Keywords) Neural networks; Learning problem; Cayley tree; Separation property; Entropy
dc.title (Title) Complexity of neural networks on Fibonacci-Cayley tree
dc.type (Type) article
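
The abstract states that the entropy of a Fibonacci tree-shift of finite type is the logarithm of the spectral radius of its corresponding matrix. The sketch below illustrates only that final spectral-radius step; the 2x2 golden-mean transition matrix used here is an illustrative assumption, not the matrix constructed in the paper.

```python
# Minimal sketch: entropy as the log of a spectral radius.
# The matrix A below is the classical golden-mean (Fibonacci) transition
# matrix, used purely as a stand-in; it is NOT the paper's construction
# for tree-shifts on the Fibonacci-Cayley tree.
import numpy as np

A = np.array([[1, 1],
              [1, 0]], dtype=float)  # hypothetical example matrix

spectral_radius = max(abs(np.linalg.eigvals(A)))  # largest eigenvalue modulus
entropy = np.log(spectral_radius)                 # log of spectral radius

print(f"spectral radius = {spectral_radius:.6f}")  # ~1.618034 (golden ratio)
print(f"entropy         = {entropy:.6f}")          # ~0.481212
```

For this stand-in matrix the spectral radius is the golden ratio (1 + sqrt(5))/2, so the printed entropy is its logarithm, roughly 0.4812.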