Mathematical Biosciences and Engineering, 2014, 11(1): 63-80. doi: 10.3934/mbe.2014.11.63.

Mathematics Subject Classification: Primary: 60G55, 62P10; Secondary: 94A17.

The effect of interspike interval statistics on the information gain under the rate coding hypothesis

1. The Institute of Statistical Mathematics, 10-3 Midori-cho, Tachikawa, Tokyo 190-8562, Japan
2. Institute of Physiology, Academy of Sciences of the Czech Republic, Videnska 1083, 14220 Prague, Czech Republic

The question of how much information can theoretically be gained from a variable neuronal firing rate, compared with a constant average firing rate, is investigated. We employ the statistical concept of information based on the Kullback-Leibler divergence, and assume rate-modulated renewal processes as a model of spike trains. We show that if the firing rate variation is sufficiently small and slow (with respect to the mean interspike interval), the information gain can be expressed by the Fisher information. Furthermore, under certain assumptions, the smallest possible information gain is provided by gamma-distributed interspike intervals. The methodology is illustrated and discussed on several different statistical models of neuronal activity.
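To make the abstract's central approximation concrete: for a renewal process whose interspike intervals are gamma-distributed with shape a and mean 1/λ, the per-spike Fisher information about the firing rate is J(λ) = a/λ², and for a small rate perturbation δ the Kullback-Leibler divergence between the perturbed and baseline interspike-interval densities satisfies D(f_{λ+δ} ‖ f_λ) ≈ δ²J(λ)/2. The Python sketch below is our own illustration of this relation, not code from the paper; the shape values and the perturbation δ are arbitrary choices, and the closed-form KL expression used here holds only for two gamma densities sharing the same shape parameter.

```python
import math

def kl_gamma_same_shape(a, lam1, lam0):
    """Exact KL divergence D(f1 || f0) between two gamma ISI densities
    sharing shape a, with mean intervals 1/lam1 and 1/lam0
    (gamma rate parameters b_i = a * lam_i)."""
    r = lam1 / lam0  # equals the ratio of the gamma rate parameters
    return a * (math.log(r) + 1.0 / r - 1.0)

def fisher_approximation(a, lam, delta):
    """Second-order approximation D ~= (delta**2 / 2) * J(lam), where
    J(lam) = a / lam**2 is the per-spike Fisher information about the
    firing rate for gamma-distributed interspike intervals."""
    return 0.5 * delta**2 * a / lam**2

lam, delta = 10.0, 0.5  # illustrative baseline rate and small perturbation
for a in (0.5, 1.0, 4.0):  # shape a = 1 recovers the Poisson process
    exact = kl_gamma_same_shape(a, lam + delta, lam)
    approx = fisher_approximation(a, lam, delta)
    print(f"shape a={a:3.1f}: exact KL = {exact:.6f}, "
          f"Fisher approx = {approx:.6f}")
```

Running the script shows the exact divergence agreeing with the Fisher-information approximation to within a few percent at δ/λ = 0.05, with the agreement improving as the perturbation shrinks, consistent with the "small and slow" regime described in the abstract.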

Keywords: neural coding; Fisher information; Kullback-Leibler divergence; rate-modulated renewal processes; neural spike trains

Citation: Shinsuke Koyama, Lubomir Kostal. The effect of interspike interval statistics on the information gain under the rate coding hypothesis. Mathematical Biosciences and Engineering, 2014, 11(1): 63-80. doi: 10.3934/mbe.2014.11.63


This article has been cited by

  • 1. Shinsuke Koyama, On the Spike Train Variability Characterized by Variance-to-Mean Power Relationship, Neural Computation, 2015, 27, 7, 1530, 10.1162/NECO_a_00748
  • 2. Stevan Pilarski, Ondrej Pokora, On the Cramér–Rao bound applicability and the role of Fisher information in computational neuroscience, Biosystems, 2015, 136, 11, 10.1016/j.biosystems.2015.07.009
  • 3. Kamil Rajdl, Petr Lansky, Stein’s neuronal model with pooled renewal input, Biological Cybernetics, 2015, 109, 3, 389, 10.1007/s00422-015-0650-x
  • 4. Marie Levakova, Massimiliano Tamborrino, Lubomir Kostal, Petr Lansky, Accuracy of rate coding: When shorter time window and higher spontaneous activity help, Physical Review E, 2017, 95, 2, 10.1103/PhysRevE.95.022310
  • 5. K. Rajdl, P. Lansky, L. Kostal, Entropy factor for randomness quantification in neuronal data, Neural Networks, 2017, 95, 57, 10.1016/j.neunet.2017.07.016
  • 6. Marie Levakova, Effect of spontaneous activity on stimulus detection in a simple neuronal model, Mathematical Biosciences and Engineering, 2016, 13, 3, 551, 10.3934/mbe.2016007
  • 7. Lubomir Kostal, Shigeru Shinomoto, Efficient information transfer by Poisson neurons, Mathematical Biosciences and Engineering, 2016, 13, 3, 509, 10.3934/mbe.2016004
  • 8. Petr Lansky, Laura Sacerdote, Cristina Zucca, The Gamma renewal process as an output of the diffusion leaky integrate-and-fire neuronal model, Biological Cybernetics, 2016, 110, 2-3, 193, 10.1007/s00422-016-0690-x
  • 9. Marie Levakova, Efficiency of rate and latency coding with respect to metabolic cost and time, Biosystems, 2017, 161, 31, 10.1016/j.biosystems.2017.06.005
  • 10. Lubomir Kostal, Petr Lansky, Michael Stiber, Statistics of inverse interspike intervals: The instantaneous firing rate revisited, Chaos: An Interdisciplinary Journal of Nonlinear Science, 2018, 28, 10, 106305, 10.1063/1.5036831
  • 11. Rimjhim Tomar, Review: Methods of Firing Rate Estimation, Biosystems, 2019, 10.1016/j.biosystems.2019.103980
  • 12. Shinuk Kim, A miRNA- and mRNA-seq-Based Feature Selection Approach for Kidney Cancer Biomarkers, Cancer Informatics, 2020, 19, 117693512090830, 10.1177/1176935120908301


Copyright Info: © 2014 Shinsuke Koyama et al., licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0).
