
Transfer entropy

Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes.[1][2][3] The transfer entropy from a process X to another process Y is the amount by which knowing the past values of X reduces the uncertainty about the future value of Y, given the past values of Y. More specifically, if X_t and Y_t, with t \in \mathbb{N}, denote two random processes and the amount of information is measured with Shannon entropy, the transfer entropy can be written as:

T_{X \rightarrow Y} = H\left(Y_t \mid Y_{t-1:t-L}\right) - H\left(Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right)

where H(X) denotes the Shannon entropy of X. The above definition has also been extended using other types of entropy measures, such as the Rényi entropy.[3][4]
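As an illustration of the definition above, the following is a minimal Python sketch (written for this article, not taken from the cited literature) of a plug-in frequency-count estimator for discrete-valued time series with history length L = 1; the function name estimate_transfer_entropy is an arbitrary choice. It simply evaluates H(Y_t | Y_{t-1}) - H(Y_t | Y_{t-1}, X_{t-1}) from empirical frequencies, so the estimate is biased for short series.

import numpy as np
from collections import Counter

def estimate_transfer_entropy(x, y, base=2):
    # Plug-in estimate of T_{X->Y} for discrete series with history length L = 1:
    # H(Y_t | Y_{t-1}) - H(Y_t | Y_{t-1}, X_{t-1}), computed from frequency counts.
    x, y = np.asarray(x), np.asarray(y)
    triples = list(zip(y[1:], y[:-1], x[:-1]))  # (y_t, y_{t-1}, x_{t-1})

    def entropy(counts):
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return -(p * np.log(p)).sum() / np.log(base)

    # H(Y_t | Y_{t-1}) = H(Y_t, Y_{t-1}) - H(Y_{t-1})
    h_given_ypast = entropy(Counter((yt, yp) for yt, yp, _ in triples)) \
                    - entropy(Counter(yp for _, yp, _ in triples))
    # H(Y_t | Y_{t-1}, X_{t-1}) = H(Y_t, Y_{t-1}, X_{t-1}) - H(Y_{t-1}, X_{t-1})
    h_given_both = entropy(Counter(triples)) \
                   - entropy(Counter((yp, xp) for _, yp, xp in triples))
    return h_given_ypast - h_given_both

# Toy check: Y copies X with one step of delay plus occasional bit flips,
# so the estimate of T_{X->Y} should clearly exceed that of T_{Y->X}.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
flips = (rng.random(5000) < 0.1).astype(int)
y = np.roll(x, 1) ^ flips
print(estimate_transfer_entropy(x, y), estimate_transfer_entropy(y, x))

For continuous-valued data, nearest-neighbor estimators are usually preferred over naive binning, as discussed in the estimation paragraph below.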

Transfer entropy can be viewed as a conditional mutual information,[5][6] conditioned on the history Y_{t-1:t-L} of the influenced variable:

T_{X \rightarrow Y} = I\left(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}\right)
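The equivalence of the two forms follows directly from the definition of conditional mutual information, I(A;B \mid C) = H(A \mid C) - H(A \mid B, C); in LaTeX notation:

\begin{align}
T_{X \to Y} &= H\left(Y_t \mid Y_{t-1:t-L}\right) - H\left(Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right) \\
            &= I\left(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}\right).
\end{align}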

For vector autoregressive processes, transfer entropy reduces to Granger causality.[7] It is therefore useful when the model assumptions of Granger causality do not hold, for example in the analysis of non-linear signals.[8][9] However, it usually requires more samples for accurate estimation.[10] The probabilities in the entropy formula can be estimated with different methods, such as binning or nearest neighbors, or, to reduce complexity, with a non-uniform embedding approach.[11] While transfer entropy was originally defined for bivariate analysis, it has since been extended to multivariate settings, either by conditioning on other potential source variables[12] or by considering transfer from a collection of sources,[13] although these extensions require even more samples.
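To make the connection to Granger causality concrete, the following Python sketch (written for this article, with illustrative coefficients and order-1 models chosen arbitrarily) fits a restricted and a full linear autoregressive model for Y by least squares; for jointly Gaussian variables, half the log-ratio of the residual variances equals the transfer entropy in nats, per the equivalence result.[7]

import numpy as np

# Simulate an order-1 linear system in which the past of X drives Y.
rng = np.random.default_rng(1)
n = 10000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()

def residual_variance(target, regressors):
    # Ordinary least-squares fit of target on the given regressors;
    # returns the variance of the residuals.
    A = np.column_stack(regressors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ coef)

yt, y_past, x_past = y[1:], y[:-1], x[:-1]
var_restricted = residual_variance(yt, [y_past])            # model with Y's past only
var_full = residual_variance(yt, [y_past, x_past])          # model with Y's and X's past
te_nats = 0.5 * np.log(var_restricted / var_full)           # transfer entropy (Gaussian case)
print(te_nats)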

Transfer entropy has been used to estimate functional connectivity between neurons,[13][14][15] social influence in social networks,[8] and statistical causality between armed conflict events.[16] Transfer entropy is a finite version of directed information, which was defined in 1990 by James Massey[17] as

I\left(X^n \rightarrow Y^n\right) = \sum_{i=1}^{n} I\left(X^i ; Y_i \mid Y^{i-1}\right),

where X^n denotes the vector X_1, X_2, \ldots, X_n and Y^n denotes Y_1, Y_2, \ldots, Y_n. Directed information plays an important role in characterizing the fundamental limits (channel capacity) of communication channels with and without feedback.[18][19]
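For context (a standard observation, included here rather than taken from the references above), directed information differs from the ordinary mutual information between the two blocks only in that each term conditions on the inputs available up to time i, which is what makes the measure time-asymmetric:

\begin{align}
I\left(X^n ; Y^n\right) &= \sum_{i=1}^{n} I\left(X^{n} ; Y_i \mid Y^{i-1}\right), \\
I\left(X^n \to Y^n\right) &= \sum_{i=1}^{n} I\left(X^{i} ; Y_i \mid Y^{i-1}\right).
\end{align}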

See also

Mutual information
Causality
Potential outcomes model

References

  1. ^ Schreiber, Thomas. Measuring information transfer. Physical Review Letters. 1 July 2000, 85 (2): 461–464. Bibcode:2000PhRvL..85..461S. PMID 10991308. S2CID 7411376. arXiv:nlin/0001042 . doi:10.1103/PhysRevLett.85.461. 
  2. ^ Seth, Anil. Granger causality. Scholarpedia. 2007, 2 (7): 1667. Bibcode:2007SchpJ...2.1667S. doi:10.4249/scholarpedia.1667 . 
  3. ^ Hlaváčková-Schindler, Katerina; Palus, M; Vejmelka, M; Bhattacharya, J. Causality detection based on information-theoretic approaches in time series analysis. Physics Reports. 1 March 2007, 441 (1): 1–46. Bibcode:2007PhR...441....1H. CiteSeerX 10.1.1.183.1617 . doi:10.1016/j.physrep.2006.12.004. 
  4. ^ Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad. Rényi's information transfer between financial time series. Physica A: Statistical Mechanics and Its Applications. 2012-05-15, 391 (10): 2971–2989. Bibcode:2012PhyA..391.2971J. ISSN 0378-4371. S2CID 51789622. arXiv:1106.5913 . doi:10.1016/j.physa.2011.12.064 (英语). 
  5. ^ Wyner, A. D. A definition of conditional mutual information for arbitrary ensembles. Information and Control. 1978, 38 (1): 51–59. doi:10.1016/s0019-9958(78)90026-8 . 
  6. ^ Dobrushin, R. L. General formulation of Shannon's main theorem in information theory. Uspekhi Mat. Nauk. 1959, 14: 3–104. 
  7. ^ Barnett, Lionel. Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables. Physical Review Letters. 1 December 2009, 103 (23): 238701. Bibcode:2009PhRvL.103w8701B. PMID 20366183. S2CID 1266025. arXiv:0910.4514 . doi:10.1103/PhysRevLett.103.238701. 
  8. ^ Ver Steeg, Greg; Galstyan, Aram. Information transfer in social media. Proceedings of the 21st international conference on World Wide Web (WWW '12). ACM: 509–518. 2012. Bibcode:2011arXiv1110.2724V. arXiv:1110.2724 . 
  9. ^ Lungarella, M.; Ishiguro, K.; Kuniyoshi, Y.; Otsu, N. Methods for quantifying the causal structure of bivariate time series. International Journal of Bifurcation and Chaos. 1 March 2007, 17 (3): 903–921. Bibcode:2007IJBC...17..903L. CiteSeerX 10.1.1.67.3585 . doi:10.1142/S0218127407017628. 
  10. ^ Pereda, E; Quiroga, RQ; Bhattacharya, J. Nonlinear multivariate analysis of neurophysiological signals.. Progress in Neurobiology. Sep–Oct 2005, 77 (1–2): 1–37. Bibcode:2005nlin.....10077P. PMID 16289760. S2CID 9529656. arXiv:nlin/0510077 . doi:10.1016/j.pneurobio.2005.10.003. 
  11. ^ Montalto, A; Faes, L; Marinazzo, D. MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy.. PLOS ONE. Oct 2014, 9 (10): e109462. Bibcode:2014PLoSO...9j9462M. PMC 4196918 . PMID 25314003. doi:10.1371/journal.pone.0109462 . 
  12. ^ Lizier, Joseph; Prokopenko, Mikhail; Zomaya, Albert. Local information transfer as a spatiotemporal filter for complex systems. Physical Review E. 2008, 77 (2): 026110. Bibcode:2008PhRvE..77b6110L. PMID 18352093. S2CID 15634881. arXiv:0809.3275 . doi:10.1103/PhysRevE.77.026110. 
  13. ^ Lizier, Joseph; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity. Journal of Computational Neuroscience. 2011, 30 (1): 85–107. PMID 20799057. S2CID 3012713. doi:10.1007/s10827-010-0271-2. 
  14. ^ Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon. Transfer entropy—a model-free measure of effective connectivity for the neurosciences. Journal of Computational Neuroscience. February 2011, 30 (1): 45–67. PMC 3040354 . PMID 20706781. doi:10.1007/s10827-010-0262-3. 
  15. ^ Shimono, Masanori; Beggs, John. Functional clusters, hubs, and communities in the cortical microconnectome. Cerebral Cortex. October 2014, 25 (10): 3743–57. PMC 4585513 . PMID 25336598. doi:10.1093/cercor/bhu252. 
  16. ^ Kushwaha, Niraj; Lee, Edward D. Discovering the mesoscale for chains of conflict. PNAS Nexus. July 2023, 2 (7). ISSN 2752-6542. PMC 10392960 . PMID 37533894. doi:10.1093/pnasnexus/pgad228. 
  17. ^ Massey, James. Causality, Feedback And Directed Information (ISITA). 1990. CiteSeerX 10.1.1.36.5688 . 
  18. ^ Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. Finite State Channels With Time-Invariant Deterministic Feedback. IEEE Transactions on Information Theory. February 2009, 55 (2): 644–662. S2CID 13178. arXiv:cs/0608070 . doi:10.1109/TIT.2008.2009849. 
  19. ^ Kramer, G. Capacity results for the discrete memoryless network. IEEE Transactions on Information Theory. January 2003, 49 (1): 4–21. doi:10.1109/TIT.2002.806135. 
