Publication: Toward Reflective Spiking Neural Networks Exploiting Memristive Devices
Publication Date
2022-06-16
Authors
Lobov, Sergey A.
Shchanikov, Sergey
Mikhaylov, Alexey
Kazantsev, Viktor B.
Publisher
Frontiers Media
Abstract
The design of modern convolutional artificial neural networks (ANNs) composed of formal neurons copies the architecture of the visual cortex. Signals proceed through a hierarchy in which receptive fields grow increasingly complex and the coding increasingly sparse. Nowadays, ANNs outperform humans in controlled pattern recognition tasks yet remain far behind in cognition. This is partly due to limited knowledge about the higher echelons of the brain hierarchy, where neurons actively generate predictions about what will happen next, i.e., information processing jumps from reflex to reflection. In this study, we forecast that spiking neural networks (SNNs) can achieve the next qualitative leap. Reflective SNNs may take advantage of their intrinsic dynamics and mimic complex, non-reflex brain actions. They also enable a significant reduction in energy consumption. However, training SNNs is a challenging problem that strongly limits their deployment. We then briefly overview new insights provided by the concept of a high-dimensional brain, which has been put forward to explain the potential power of single neurons in higher brain stations and deep SNN layers. Finally, we discuss the prospects of implementing neural networks in memristive systems. Such systems can densely pack 2D or 3D arrays of plastic synaptic contacts on a chip, directly processing analog information. Memristive devices are thus a good candidate for implementing in-memory and in-sensor computing. Memristive SNNs could then diverge from the development path of ANNs and build their own niche of cognitive, or reflective, computations.
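The in-memory computing idea sketched in the abstract can be illustrated with a toy model. The following is a hypothetical sketch (all values, names, and scalings are illustrative assumptions, not taken from the paper): a memristive crossbar realizes a vector-matrix multiply through device physics alone, and routing the resulting currents into leaky integrate-and-fire neurons turns that analog operation into the spikes an SNN works with.

```python
import numpy as np

# Hypothetical sketch (illustrative values, not from the paper): each
# cross-point conductance G[i, j] of a memristive crossbar stores one
# synaptic weight. Applying read voltages V[j] to the columns yields,
# by Ohm's and Kirchhoff's laws, the row currents
#   I[i] = sum_j G[i, j] * V[j]
# i.e., one multiply-accumulate per device, computed "in memory".
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 8))  # conductances, 1 uS .. 100 uS
V = rng.uniform(0.0, 0.2, size=8)         # read voltages (volts)
I = G @ V                                 # output current of each row

# Feeding these analog currents into leaky integrate-and-fire (LIF)
# neurons converts the in-memory MAC into spikes, the event-driven
# signals of an SNN.
v = np.zeros(4)                  # membrane potentials
tau, v_th, dt = 20e-3, 1.0, 1e-3 # time constant, threshold, time step
gain = 5e4                       # arbitrary current-to-potential scaling
spikes = []
for _ in range(50):
    v += dt / tau * (-v + gain * I)  # leaky integration toward gain * I
    fired = v >= v_th
    spikes.append(fired.copy())
    v[fired] = 0.0                   # reset neurons that fired
```

Because the multiply-accumulate happens where the weights are stored, no data shuttling between memory and processor is needed, which is the source of the energy savings the abstract alludes to.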
Citation
Abbott, L. F. (1999). Lapicque’s introduction of the integrate-and-fire model neuron (1907). Brain Res. Bull. 50, 303–304. doi: 10.1016/S0361-9230(99)00161-6
Agudov, N. V., Dubkov, A. A., Safonov, A. V., Krichigin, A. V., Kharcheva, A. A., Guseinov, D. V., et al. (2021). Stochastic model of memristor based on the length of conductive region. Chaos Solitons Fract. 150:111131. doi: 10.1016/j.chaos.2021.111131
Agudov, N. V., Safonov, A. V., Krichigin, A. V., Kharcheva, A. A., Dubkov, A. A., Valenti, D., et al. (2020). Nonstationary distributions and relaxation times in a stochastic model of memristor. J. Stat. Mech. Theory Exp. 2020:24003. doi:10.1088/1742-5468/ab684a
Alexander, D. M., Trengove, C., Sheridan, P. E., and van Leeuwen, C. (2011). Generalization of learning by synchronous waves: from perceptual organization to invariant organization. Cogn. Neurodyn. 5, 113–132. doi:10.1007/s11571-010-9142-9
Altenberger, F., and Lenz, C. (2018). A non-technical survey on deep convolutional neural network architectures. arXiv [preprint]. Available online at: https://arxiv.
org/abs/1803.02129 (accessed May 26, 2022).
Amirsoleimani, A., Alibart, F., Yon, V., Xu, J., Pazhouhandeh, M. R., Ecoffey, S., et al. (2020). In-memory vector-matrix multiplication in monolithic complementary metal–oxide–semiconductor-memristor integrated circuits:
design choices, challenges, and perspectives. Adv. Intell. Syst. 2:2000115. doi: 10.1002/aisy.202000115
Ankit, A., Sengupta, A., Panda, P., and Roy, K. (2017). “RESPARC: a reconfigurable and energy-efficient architecture with memristive crossbars for deep spiking neural networks,” in Proceedings of the 54th Annual Design Automation
Conference, (Austin, TX: IEEE).
Araque, A., Parpura, V., Sanzgiri, R. P., and Haydon, P. G. (1999). Tripartite synapses: glia, the unacknowledged partner. Trends Neurosci. 22, 208–215. doi:
10.1016/s0166-2236(98)01349-6
Baek, S., Eshraghian, J. K., Thio, W., Sandamirskaya, Y., Iu, H. H., and Lu, W. D. (2020). “Live demonstration: video-to-spike conversion using a real-time retina cell network simulator,” in Proceedings of the 2020 2nd IEEE Int. Conf. Artif. Intell. Circuits System (AICAS), (Piscataway. NJ: IEEE), 131–131.
Barlow, H. B. (1961). “Possible principles underlying the transformation of sensory messages,” in Sensory Communication, ed. S. Ferne (Cambridge, MA: MIT Press).
Bayat, F. M., Prezioso, M., Chakrabarti, B., Nili, H., Kataeva, I., and Strukov, D. (2018). Implementation of multilayer perceptron network with highly uniform passive memristive crossbar circuits. Nat. Commun. 9, 1–7. doi: 10.1038/s41467-018-04482-4
Bellman, R. (1957). Dynamic Programming. Princeton, NJ: Princeton University Press.
Benito, N., Fernandez-Ruiz, A., Makarov, V. A., Makarova, J., Korovaichuk, A., and Herreras, O. (2014). Spatial modules of coherent activity in pathway-specific LFPs in the hippocampus reflect topology and different modes of presynaptic synchronization. Cereb. Cortex 24, 1738–1752. doi: 10.1093/cercor/bht022
Benito, N., Martin-Vazquez, G., Makarova, J., Makarov, V. A., and Herreras, O. (2016). The right hippocampus leads the bilateral integration of gamma-parsed lateralized information. eLife 5:e16658. doi: 10.7554/eLife.16658.001
Beyer, K., Goldstein, J., Ramakrishnan, R., and Shaft, U. (1999). “When is “nearest neighbor” meaningful?,” in Proceedings of the 7th International Conference Database Theory (ICDT), (Princeton, NJ: IEEE), 217–235.
Bhat, A. A., Mahajan, G., and Mehta, A. (2011). Learning with a network of competing synapses. PLoS One 6:e25048. doi: 10.1371/journal.pone.0025048
Bi, G. Q., and Poo, M. M. (1998). Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type. J. Neurosci. 18, 10464–10472. doi: 10.1523/JNEUROSCI.18-24-10464. 1998
Bogue, R. (2017). Domestic robots: has their time finally come? Ind. Robot Intern. J. 44, 129–136.
Bohte, S. M., Kok, J. N., and Poutré, H. L. (2002). Error-backpropagation in temporally encoded networks of spiking neurons. Neurocomputing 48, 17–37.
Bower, J. M., and Beeman, D. (1998). The Book of GENESIS: Exploring Realistic Neural Models with the General NEural SImulation System, 2nd Edn. New York, NY: Springer Verlag.
Bowers, J. (2009). On the biological plausibility of grandmother cells: implications for neural network theories in psychology and neuroscience. Psychol. Rev. 116, 220–251. doi: 10.1037/a0014462
Cai, F., Correll, J. M., Lee, S. H., Lim, Y., Bothra, V., Zhang, Z., et al. (2019). A fully integrated reprogrammable memristor–CMOS system for efficient multiply – accumulate operations. Nat. Electron. 2, 290–299. doi: 10.1038/s41928-019-0270-x
Calvo Tapia, C., Makarov, V. A., and van Leeuwen, C. (2020a). Basic principles drive self-organization of brain-like connectivity structure. Commun. Nonlinear Sci. Numer. 82:105065. doi: 10.1016/j.cnsns.2019.105065
Calvo Tapia, C., Tyukin, I., and Makarov, V. A. (2020b). Universal principles justify the existence of concept cells. Sci. Rep. 10:7889. doi: 10.1038/s41598-020-64466-7
Calvo Tapia, C., Villacorta-Atienza, J. A., Diez-Hermano, S., Khoruzhko, M., Lobov, S. A., Potapov, I., et al. (2020c). Semantic knowledge representation for strategic interactions in dynamic situations. Front. Neurorobot. 4:4. doi:10.3389/fnbot.2020.00004
Calvo Tapia, C., Tyukin, I. Y., and Makarov, V. A. (2018). Fast social-like learning of complex behaviors based on motor motifs. Phys. Rev. E . 97:052308 doi:10.1103/PhysRevE.97.052308
Cao, Y., Chen, Y., and Khosla, D. (2015). Spiking deep convolutional neural networks for energy-efficient object recognition. Int. J. Comput. Vis. 113, 54–66.
Carboni, R., and Ielmini, D. (2019). Stochastic memory devices for security and computing. Adv. Electron. Mater. 5, 1–27. doi: 10.1002/aelm.201900198
Chater, T. E., and Goda, Y. (2021). My neighbour hetero-deconstructing the mechanisms underlying heterosynaptic plasticity. Curr. Opin. Neurobiol. 67, 106–114. doi: 10.1016/j.conb.2020.10.007
Chen, S., Lou, Z., Chen, D., and Shen, G. (2018). An artificial flexible visual memory system based on an UV-motivated memristor. Adv.Mater. 30:1705400.
doi: 10.1002/adma.201705400
Chen, Y., Mai, Y., Feng, R., and Xiao, J. (2022). An adaptive threshold mechanism for accurate and efficient deep spiking convolutional neural networks. Neurocomputing 469, 189–197. doi: 10.1016/j.neucom.2021.10.080
Chou, T.-S., Bucci, L., and Krichmar, J. (2015). Learning touch preferences with a tactile robot using dopamine modulated STDP in a model of insular cortex. Front. Neurorobot. 9:6. doi: 10.3389/fnbot.2015.00006
Chua, L. O. (1971). Memristor-The missing circuit element. IEEE Trans. Circ. Theory 18, 507–519. doi: 10.1109/TCT.1971.1083337
Chua, L. O., and Kang, S. M. (1976). Memristive devices and systems. Proc. IEEE 64, 209–223. doi:10.1109/PROC.1976.10092
Cook, N. (2008). The neuron-level phenomena underlying cognition and consciousness: synaptic activity and the action potential. Neuroscience 153, 556–570. doi: 10.1016/j.neuroscience.2008.02.042
Cybenko, G. (1989). Approximation by superpositions of a sigmoidal function. Math. Contr. Signals Syst. 2, 303–314.
Dai, Z., Liu, H., Le, Q. V., and Tan, M. (2021). CoAtNet: marrying convolution and attention for all data sizes. arXiv [preprint]. Available online at: https:
//arxiv.org/abs/2106.04803 (accessed May 26, 2022).
Dearnaley, G. (1970). Electrical phenomena in amorphous oxide films. Rep. Progr. Phys. 33:1129.
Delorme, A., Gautrais, J., van Rullen, R., and Thorpe, S. (1999). SpikeNET: a simulator for modeling large networks of integrate and fire neurons. Neurocomputing 26–27, 989–996. doi: 10.1016/S0925-2312(99)00095-8
Demin, V. A., and Erokhin, V. V. (2016). Hidden symmetry shows what a memristor is. Int. J. Unconv. Comput. 12, 433–438.
Demin, V. A., Erokhin, V. V., Kashkarov, P. K., and Kovalchuk, M. V. (2014). Electrochemical model of the polyaniline based organic memristive device. J. Appl. Phys. 116:064507. doi: 10.1063/1.4893022
Demin, V. A., Nekhaev, D. V., Surazhevsky, I. A., Nikiruy, K. E., Emelyanov, A. V., Nikolaev, S. N., et al. (2021). Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network. Neural Netw. 134, 64–75. doi: 10.1016/j.neunet.2020.11.005
Diehl, P., and Cook, M. (2015). Unsupervised learning of digit recognition using spike-timing-dependent plasticity. Front. Comput. Neurosci. 9:99. doi: 10.3389/fncom.2015.00099
Diehl, P. U., Neil, D., Binas, J., Cook, M., Liu, S.-C., and Pfeiffer, M. (2015). “Fastclassifying, high-accuracy spiking deep networks through weight and threshold balancing,” in Proceedings of the 2015 International Joint Conference on Neural Networks (IJCNN), (Piscataway, NJ: IEEE), 1–8.
Dityatev, A., Schachner, M., and Sonderegger, P. (2010). The dual role of the extracellular matrix in synaptic plasticity and homeostasis. Nat. Rev. Neurosci. 11, 735–746. doi: 10.1038/nrn2898
Donoho, D. L. (2000). High-dimensional data analysis: the curses and blessings of dimensionality. AMS Math Challeng. Lecture 1:32.
Dora, S., and Kasabov, N. (2021). Spiking neural networks for computational intelligence: an overview. Big Data Cogn. Comput. 5:67.
Draelos, T. J., Miner, N. E., Lamb, C. C., Vineyard, C. M., Carlson, K. D., James, C. D., et al. (2017). Neurogenesis deep learning. arXiv [preprint]. Available online at: https://arxiv.org/abs/1612.03770 (accessed May 26, 2022).
Dreier, J. P., Fabricius, M., Ayata, C., Sakowitz, O. W., Shuttleworth, C. W., Dohmen, C., et al. (2017). Recording, analysis, and interpretation of spreading depolarizations in neurointensive care: review and recommendations of the COSBID research group. J. Cereb. Blood Flow Metab. 37, 1595–1625. doi: 10.1177/0271678X16654496
Du, C., Ma, W., Chang, T., Sheridan, P., and Lu, W. D (2015). Biorealistic implementation of synaptic functions with oxide memristors through internal ionic dynamics. Adv. Funct. Mater. 25, 4290–4299. doi: 10.1002/adfm.201501427
Durkee, C. A., and Araque, A. (2019). Diversity and specificity of astrocyte–neuron communication. Neuroscience 396, 73–78. doi: 10.1016/j.neuroscience.2018.11.010
Edwards, J. (2005). Is consciousness only a property of individual cells? J. Conscious. Stud. 12, 60–76.
Emelyanov, A. V., Nikiruy, K. E., Demin, V. A., Rylkov, V. V., Belov, A. I., Korolev, D. S., et al. (2019). Yttria-stabilized zirconia cross-point memristive devices for
neuromorphic applications.Microelectron. Eng. 215:110988. doi: 10.1016/j.mee.2019.110988
Erokhin, V. (2020). Memristive devices for neuromorphic applications: comparative analysis. Bionanoscience 10, 834–847. doi: 10.1007/s12668-020-00795-1
Eshraghian, J. K., Baek, S., Thio, W., Sandamirskaya, Y., Iu, H. H., and Lu, W. D. (2019). A real-time retinomorphic simulator using a conductance-based discrete neuronal network. arXiv [preprint]. Available online at: https://arxiv.org/abs/2001.05430 (accessed May 26, 2022).
Eshraghian, J. K., Cho, K., Zheng, C., Nam, M., Iu, H. H. C., Lei, W., et al. (2018). Neuromorphic vision hybrid RRAM-CMOS architecture. IEEE Trans. Very Large Scale Integrat. Syst. 26, 2816–2829.
Esser, S. K., Merolla, P. A., Arthur, J. V., Cassidy, A. S., Appuswamy, R., Andreopoulos, A., et al. (2016). Convolutional networks for fast, energyefficient neuromorphic computing. Proc. Natl. Acad. Sci. U.S.A. 113, 11441–11446. doi: 10.1073/pnas.1604850113
Feldman, D. E. (2012). The spike-timing dependence of plasticity. Neuron 75, 556–571. doi: 10.1016/j.neuron.2012.08.001
Field, D. J. (1987). Relations between the statistics of natural images and the response properties of cortical cells. J. Opt. Soc. Am. A 4, 2379–2394. doi:
10.1364/josaa.4.002379
FitzHugh, R. (1961). Impulses and physiological states in theoretical models of nerve membrane. Biophys. J. 1, 445. doi: 10.1016/s0006-3495(61)86902-6
Florian, R. V. (2012). The Chronotron: a neuron that learns to fire temporally precise spike patterns. PLoS One 7:e40233. doi: 10.1371/journal.pone.0040233
Georgopoulos, A. P., Schwartz, A. B., and Kettner, R. E. (1986). Neuronal population coding of movement direction. Science 233, 1416–1419.
Gerasimova, S. A., Belov, A. I., Korolev, D. S., Guseinov, D. V., Lebedeva, A. V., Koryazhkina, M. N., et al. (2021). Stochastic memristive interface for neural signal processing. Sensors 21, 1–12. doi: 10.3390/s21165587
Gerasimova, S. A., Mikhaylov, A. N., Belov, A. I., Korolev, D. S., Gorshkov, O. N., and Kazantsev, V. B. (2017). Simulation of synaptic coupling of neuron-like generators via a memristive device. Tech. Phys. 62, 1259–1265. doi: 10.1134/S1063784217080102
Ghosh-Dastidar, S., and Adeli, H. (2007). Improved spiking neural networks for EEG classification and epilepsy and seizure detection. Integr. Comput. Aided. Eng. 14, 187–212.
Gong, P., and Van Leeuwen, C. (2009). Distributed dynamical computation in neural circuits with propagating coherent activity patterns. PLoS Comput. Biol. 5:e1000611. doi: 10.1371/journal.pcbi.1000611
Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep Learning. Cambridge, MA: The MIT Press.
Gorban, A. N., Makarov, V. A., and Tyukin, I. Y. (2019). The unreasonable effectiveness of small neural ensembles in high-dimensional brain. Phys. Life Rev. 29, 55–88. doi: 10.1016/j.plrev.2018.09.005
Gorban, A. N., Makarov, V. A., and Tyukin, I. Y. (2020). High-dimensional brain in a high-dimensional world: blessing of dimensionality. Entropy 22:82. doi:10.3390/e22010082
Gorban, A. N., Tyukin, I., Prokhorov, D., and Sofeikov, K. (2016). Approximation with random bases: pro et contra. Inf. Sci. 364–365, 129–145.
Gorban, A. N., and Tyukin, I. Y. (2018). Blessing of dimensionality: mathematical foundations of the statistical physics of data. Philos. Trans. R. Soc. A 376:20170237. doi: 10.1098/rsta.2017.0237
Gordleeva, S. Y., Ermolaeva, V. A., Kastalskiy, I. A., and Kazantsev, V. B. (2019). Astrocyte as spatiotemporal integrating detector of neuronal activity. Front. Physiol. 10:294. doi: 10.3389/fphys.2019.00294
Gordleeva, S. Y., Tsybina, Y. A., Krivonosov, M. I., Ivanchenko, M. V., Zaikin, A. A., Kazantsev, V. B., et al. (2021). Modeling working memory in a spiking neuron network accompanied by astrocytes. Front. Cell. Neurosci. 15:631485.
doi: 10.3389/fncel.2021.631485
Goriounova, N. A., Heyer, D. B., Wilbers, R., Verhoog, M. B., and Giugliano, M. (2018). Large and fast human pyramidal neurons associate with intelligence. eLife 7:e41714. doi: 10.7554/eLife.41714
Goswami, S., Deb, D., Tempez, A., Chaigneau, M., Rath, S. P., Lal, M., et al. (2020). Nanometer-scale uniform conductance switching in molecular memristors. Adv. Mater. 32, 1–11. doi: 10.1002/adma.202004370
Grill-Spector, K., Weiner, K. S., Gomez, J., Stigliani, A., and Natu, V. S. (2018). The functional neuroanatomy of face perception: from brain measurements to deep neural networks. Interface Focus 8:20180013. doi: 10.1098/rsfs.2018. 0013
Guseinov, D. V., Matyushkin, I. V., Chernyaev, N. V., Mikhaylov, A. N., and Pershin, Y. V. (2021a). Capacitive effects can make memristors chaotic. Chaos Solitons Fract. 144:110699. doi: 10.1016/j.chaos.2021.110699
Guseinov, D. V., Mikhaylov, A. N., and Pershin, Y. V (2021b). The rich dynamics of memristive devices with non-separable nonlinear response. IEEE Trans. Circ. Syst. II Express Briefs 7747, 1–5. doi: 10.1109/TCSII.2021.3115111
Gütig, R., and Sompolinsky, H. (2006). The tempotron: a neuron that learns spike timing–based decisions. Nat. Neurosci. 9, 420–428. doi: 10.1038/nn1643
Hanin, B. (2019). Universal function approximation by deep neural nets with bounded width and ReLU activations. Mathematics 7:992.
Heitmann, S., Gong, P., and Breakspear, M. (2012). A computational role for bistability and traveling waves in motor cortex. Front. Comput. Neurosci. 6:67. doi: 10.3389/fncom.2012.00067
Hellrigel, S., Jarman, N., and van Leeuwen, C. (2019). Adaptive rewiring in weighted networks. Cogn. Syst. Res. 55, 205–218.
Herculano-Houzel, S. (2012). The remarkable, yet not extraordinary, human brain as a scaled-up primate brain and its associated cost. Proc. Natl Acad. Sci. U.S.A. 109, 10661–10668. doi: 10.1073/pnas.1201895109
Hodgkin, A. L., and Huxley, A. F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 117, 500–544.
Hubel, D. H., andWiesel, T. N. (1968). Receptive fields and functional architecture of monkey striate cortex. J. Physiol. 195, 215–243.
Hussain, R., and Zeadally, S. (2018). Autonomous cars: research results, issues and future challenges. IEEE Comm. Surv. Tutor. 21, 1275–1313. doi: 10.1109/COMST.2018.2869360
Ielmini, D., and Waser, R. (2016). Resistive Switching: From Fundamentals of Nanoionic Redox Processes to Memristive Device Applications. Weinheim: WILEY-VCH.
Ignatov, M., Hansen, M., Ziegler, M., and Kohlstedt, H. (2016). Synchronization of two memristively coupled van der Pol oscillators. Appl. Phys. Lett. 108. doi: 10.1063/1.4942832
Ignatov, M., Ziegler, M., Hansen, M., and Kohlstedt, H. (2017). Memristive stochastic plasticity enables mimicking of neural synchrony: memristive circuit emulates an optical illusion. Sci. Adv. 3:e1700849. doi: 10.1126/sciadv.1700849
Imagenet (2022). Available online at: https://paperswithcode.com/sota/imageclassification-on-imagenet (accessed July 01, 2022).
Indiveri, G., Linares-Barranco, B., Hamilton, T. J., van Schaik, A., Etienne-Cummings, R., Delbruck, T., et al. (2011). Neuromorphic silicon neuron circuits. Front. Neurosci. 5:73. doi: 10.3389/fnins.2011.00073
Izhikevich, E. M. (2003). Simple model of spiking neurons. IEEE Trans. Neural Networks 14, 1569–1572.
Izhikevich, E. M. (2005). Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Cambridge, MA: MIT press.
Izhikevich, E. M. (2007). Solving the distal reward problem through linkage of STDP and dopamine signaling. Cereb. Cortex 17, 2443–2452. doi: 10.1093/cercor/bhl152
James, W. (1890). “The mind-stuff theory,” in The Principles of Psychology, ed. W. James (New York, NY: Henry Holt and Co), 145–182.
Jo, S. H., Chang, T., Ebong, I., Bhadviya, B. B., Mazumder, P., and Lu, W. (2010). Nanoscale memristor device as synapse in neuromorphic systems. Nano Lett. 10, 1297–1301. doi: 10.1021/nl904092h
Kazantsev, V., Gordleeva, S., Stasenko, S., and Dityatev, A. (2012). A homeostatic model of neuronal firing governed by feedback signals from the extracellular matrix. PLoS One 7:e41646. doi: 10.1371/journal.pone.0041646
Keane, A., and Gong, P. (2015). Propagating waves can explain irregular neural dynamics. J. Neurosci. 35, 1591–1605. doi: 10.1523/JNEUROSCI.1669-14. 2015
Keck, T., Hübener, M., and Bonhoeffer, T. (2017). Interactions between synaptic homeostatic mechanisms: an attempt to reconcile BCM theory, synaptic scaling, and changing excitation/inhibition balance. Curr. Opin. Neurobiol. 43, 87–93. doi: 10.1016/j.conb.2017.02.003
Khan, A., Sohail, A., Zahoora, U., and Qureshi, A. S. (2020). A survey of the recent architectures of deep convolutional neural networks. Artif. Intell. Rev. 53, 5455–5516.
Kim, J., Pershin, Y. V., Yin, M., Datta, T., and Di Ventra, M. (2020). An experimental proof that resistance-switching memory cells are not memristors. Adv. Electron. Mater. 6, 1–6. doi: 10.1002/aelm.202000010
Kim, S., Du, C., Sheridan, P.,Ma, W., Choi, S., and Lu, W. D. (2015). Experimental demonstration of a second-order memristor and its ability to biorealistically implement synaptic plasticity. Nano Lett. 15, 2203–2211. doi: 10.1021/acs.nanolett.5b00697
Koch, C., and Segev, I. (1999). Methods in Neuronal Modeling: From Ions to Networks, 2nd Edn. Cambridge, MA: MIT Press.
Kreinovich, V., and Kosheleva, O. (2021). Limit theorems as blessing of dimensionality: neural-oriented overview. Entropy 23:501. doi: 10.3390/e23050501
Krizhevsky, A., Sutskever, I., and Hinton, G. E. (2012). Imagenet classification with deep convolutional neural networks. Adv. Neur. Inf. Proces. Syst. 25, 1097–1105.
Kumar, S.,Williams, R. S., and Wang, Z. (2020). Third-order nanocircuit elements for neuromorphic engineering. Nature 585, 518–523. doi: 10.1038/s41586-020-2735-5
Kutter, E. F., Bostroem, J., Elger, C. E., Mormann, F., and Nieder, A. (2018). Single neurons in the human brain encode numbers. Neuron 100, 753–761. doi: 10.1016/j.neuron.2018.08.036
Laskar, M. N. U., Giraldo, L. G. S., and Schwartz, O. (2018). Correspondence of deep neural networks and the brain for visual textures. arXiv [preprint]. Available online at: https://arxiv.org/abs/1806.02888 (accessed May 26, 2022).
Lazarevich, I., Stasenko, S., Rozhnova, M., Pankratova, E., Dityatev, A., and Kazantsev, V. (2020). Activity-dependent switches between dynamic regimes of extracellular matrix expression. PLoS One 15:e0227917. doi: 10.1371/journal.pone.0227917
Lebedev, A. E., Solovyeva, K. P., and Dunin-Barkowski, W. L. (2020). “The large-scale symmetry learning applying Pavlov principle,” in Proceedings of the International Conference on Neuroinformatics, (Cham: Springer), 405–411.
Lebedev, M. A., and Nicolelis, M. A. L. (2017). Brain-machine interfaces: from basic science to neuroprostheses and neurorehabilitation. Physiol. Rev. 97, 767–837.
doi: 10.1152/physrev.00027.2016
LeCun, Y., Boser, B., Denker, J. S., Henderson, D., Howard, R. E., Hubbard, W., et al. (1989). Backpropagation applied to handwritten zip code recognition. Neural Comput. 1, 541–551.
Ledoux, M. (2005). “The concentration of measure phenomenon,” in Mathematical Surveys & Monographs, ed. N. H. Mustafa (Providence, RI: AMS).
Lee, C., Sarwar, S. S., Panda, P., Srinivasan, G., and Roy, K. (2020). Enabling spikebased backpropagation for training deep neural network architectures. Front. Neurosci. 14:119. doi: 10.3389/fnins.2020.00119
Lee, E. K., Gerla, M., Pau, G., Lee, U., and Lim, J. H. (2016). Internet of vehicles: from intelligent grid to autonomous cars and vehicular fogs. Int. J. Distrib.
Sensor Netw. 12:1550147716665500.
Lee, S. H., Zhu, X., and Lu, W. D. (2020). Nanoscale resistive switching devices for memory and computing applications. Nano Res. 13, 1228–1243.
Legenstein, R., Naeger, C., and Maass, W. (2005). What can a neuron learn with spike-timing-dependent plasticity? Neural Comput. 17, 2337–2382. doi: 10.1162/0899766054796888
Lennie, P. (2003). The cost of cortical computation. Curr. Biol. 13, 493–497. doi: 10.1016/S0960-9822(03)00135-0
Li, C., Hu, M., Li, Y., Jiang, H., Ge, N., Montgomery, E., et al. (2018a). Analogue signal and image processing with large memristor crossbars. Nat. Electron. 1, 52–59. doi: 10.1038/s41928-017-0002-z
Li, C., Belkin, D., Li, Y., Yan, P., Hu, M., Ge, N., et al. (2018b). Efficient and self-adaptive in-situ learning in multilayer memristor neural networks. Nat. Commun. 9, 7–14. doi: 10.1038/s41467-018-04484-2
Li, X., Tang, J., Zhang, Q., Gao, B., Yang, J. J., Song, S., et al. (2020). Powerefficient neural network with artificial dendrites. Nat. Nanotechnol. 15, 776–782. doi: 10.1038/s41565-020-0722-5
Lin, Y., Li, J., Lin, M., and Chen, J. (2014). A new nearest neighbor classifier via fusing neighborhood information. Neurocomputing 143, 164–169.
Lindsay, G. W., Rigotti, M., Warden, M. R., Miller, E. K., and Fusi, S. (2017). Hebbian learning in a random network captures selectivity properties of prefrontal cortex. J. Neurosci. 37, 11021–11036. doi: 10.1523/JNEUROSCI. 1222-17.2017
Lobov, S., Simonov, A., Kastalskiy, I., and Kazantsev, V. (2016). Network response synchronization enhanced by synaptic plasticity. Eur. Phys. J. Spec. Top. 225, 29–39.
Lobov, S. A., Chernyshov, A. V., Krilova, N. P., Shamshin, M. O., and Kazantsev, V. B. (2020a). Competitive learning in a spiking neural network: towards an intelligent pattern classifier. Sensors 20:500. doi: 10.3390/s20020500
Lobov, S. A., Mikhaylov, A. N., Shamshin, M.,Makarov, V. A., and Kazantsev, V. B. (2020b). Spatial properties of STDP in a self-learning spiking neural network enable controlling a mobile robot. Front. Neurosci. 14:88. doi: 10.3389/fnins.
2020.00088
Lobov, S. A., Mikhaylov, A. N., Berdnikova, E. S., Makarov, V. A., and Kazantsev, V. B. (2021a). Spatial computing in structured spiking neural networks with a robotic embodiment. arXiv [preprint]. Available online at: https://arxiv.org/abs/2112.07150 doi: 10.48550/arXiv.2112.07150 (accessed May 26, 2022).
Lobov, S. A., Zharinov, A. I., Makarov, V. A., and Kazantsev, V. B. (2021b). Spatial memory in a spiking neural network with robot embodiment. Sensors 21:2678.
doi: 10.3390/s21082678
Lobov, S. A., Zharinov, A. I., Semenova, O., and Kazantsev, V. B. (2021c). “Topological classification of population activity in spiking neural network,” in Proceedings of the Saratov Fall Meeting 2020: Computations and Data Analysis: from Molecular Processes to Brain Functions (SPIE), ed. D. E. Postnov (Bellingham: SPIE).
Lobov, S. A., Zhuravlev, M. O., Makarov, V. A., and Kazantsev, V. B. (2017). Noise enhanced signaling in STDP driven spiking-neuron network.Math. Model. Nat. Phenom. 12, 109–124.
Lu, Z., Pu, H., Wang, F., Hu, Z., and Wang, L. (2017). The expressive power of neural networks: a view from the width. Int. Adv. Neural Inf. Proc. Syst. 30, 6231–6239.
Mackenzie, A. (2013). Programming subjects in the regime of anticipation: software studies and subjectivity. Subjectivity 6, 391–405.
Makarov, V. A., and Villacorta-Atienza, J. A. (2011). “Compact internal representation as a functional basis for protocognitive exploration of dynamic environments,” in Recurrent Neural Networks for Temporal Data Processing, ed.
H. Cardot (London: Intech).
Makarova, J., Makarov, V. A., and Herreras, O. (2010). Generation of sustained field potentials by gradients of polarization within single neurons: a macroscopic
model of spreading depression. J. Neurophysiol. 103, 2446–2457. doi: 10.1152/jn.01045.2009
Markram, H., Lübke, J., Frotscher, M., and Sakmann, B. (1997). Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs. Science 275, 213–215. doi: 10.1126/science.275.5297.213
Matsukatova, A. N., Emelyanov, A. V., Minnekhanov, A. A., Nesmelov, A. A., Vdovichenko, A. Y., Chvalun, S. N., et al. (2020). Resistive switching kinetics and second-order effects in parylene-based memristors. Appl. Phys. Lett.
117:243501. doi: 10.1063/5.0030069
McCulloch, W. S., and Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5, 115–133.
McKennoch, S., Liu, D., and Bushnell, L. G. (2006). “Fast modifications of the SpikeProp algorithm,” in Proceedings of the 2006 IEEE International Joint Conference on Neural Networks, (Vancouver, BC: IEEE), 3970–3977.
Mehonic, A., Sebastian, A., Rajendran, B., Simeone, O., Vasilaki, E., and Kenyon, A. J. (2020). Memristors-from in-memory computing, deep learning acceleration, and spiking neural networks to the future of neuromorphic and bio-inspired computing. Adv. Intell. Syst. 2:2000085. doi: 10.1002/aisy.202000085
Merolla, P. A., Arthur, J. V., Alvarez-Icaza, R., Cassidy, A. S., Sawada, J., Akopyan, F., et al. (2014). A million spiking-neuron integrated circuit with a scalable
communication network and interface. Science 345, 668–673. doi: 10.1126/science.1254642
Mikhaylov, A., Pimashkin, A., Pigareva, Y., Gerasimova, S., Gryaznov, E., Shchanikov, S., et al. (2020). Neurohybrid memristive CMOS-integrated systems for biosensors and neuroprosthetics. Front. Neurosci. 14:358. doi: 10.3389/fnins.2020.00358
Mikhaylov, A. N., Guseinov, D. V., Belov, A. I., Korolev, D. S., Shishmakova, V. A., Koryazhkina, M. N., et al. (2021). Stochastic resonance in a metal-oxide memristive device. Chaos Solitons Fract. 144:110723. doi: 10.1016/j.chaos.2021.
110723
Mohemmed, A., Schliebs, S., Matsuda, S., and Kasabov, N. (2012). SPAN: spike pattern association neuron for learning spatio-temporal spike patterns. Int. J. Neural Syst. 22:1250012. doi: 10.1142/S0129065712500128
Moravec, H. (1988). Mind Children: The Future of Robot and Human Intelligence. Cambridge, MA: Harvard University Press.
Mormann, F., Dubois, J., Kornblith, S., Milosavljevic, M., Cerf, M., Ison, M., et al. (2011). A category-specific response to animals in the right human amygdala. Nat. Neurosci. 14, 1247–1249. doi: 10.1038/nn.2899
Morrison, A., Diesmann, M., and Gerstner, W. (2008). Phenomenological models of synaptic plasticity based on spike timing. Biol. Cybern. 98, 459–478. doi:
10.1007/s00422-008-0233-1
Mostafa, H. (2018). Supervised learning based on temporal coding in spiking neural networks. IEEE Trans. Neural Netw. Learn. Syst. 29, 3227–3235. doi: 10.1109/TNNLS.2017.2726060
Mozafari, M., Ganjtabesh, M., Nowzari-Dalini, A., Thorpe, S. J., and Masquelier, T. (2018). Bio-inspired digit recognition using reward-modulated spike-timingdependent
plasticity in deep convolutional networks. arXiv [preprint]. Available online at: https://arxiv.org/abs/1804.00227 (accessed May 26, 2022).
Muller, L., Chavane, F., Reynolds, J., and Sejnowski, T. J. (2018). Cortical travelling waves: mechanisms and computational principles. Nat. Rev. Neurosci. 19, 255–
268. doi: 10.1038/nrn.2018.20
Naoumenko, D., and Gong, P. (2019). Complex dynamics of propagating waves in a two-dimensional neural field. Front. Comput. Neurosci. 13:50. doi: 10.3389/fncom.2019.00050
Neftci, E. O., Mostafa, H., and Zenke, F. (2019). Surrogate gradient learning in spiking neural networks: bringing the power of gradient-based optimization to spiking neural networks. IEEE Signal Process. Mag. 36, 51–63. doi: 10.1109/
MSP.2019.2931595
Neil, D., Pfeiffer, M., and Liu, S.-C. (2016). “Learning to be efficient: algorithms for training low-latency, low-compute deep spiking neural networks,” in Proceedings of the 31st Ann. ACM Symp. Appl. Comp. SAC ’16, (New York, NY:
Association for Computing Machinery), 293–298.
Ohno, T., Hasegawa, T., Tsuruoka, T., Terabe, K., Gimzewski, J. K., and Aono, M. (2011). Short-term plasticity and long-term potentiation mimicked in single inorganic synapses. Nat. Mater. 10, 591–595. doi: 10.1038/nmat3054
Olshausen, B. A., and Field, D. J. (1997). Sparse coding with an overcomplete basis set: a strategy employed by V1? Vis. Res. 37, 3311–3325. doi: 10.1016/s0042-6989(97)00169-7
Olshausen, B. A., and Field, D. J. (2004). Sparse coding of sensory inputs. Curr. Opin. Neurobiol. 14, 481–487. doi: 10.1016/j.conb.2004.07.007
Palmer, J. H. C., and Gong, P. (2014). Associative learning of classical conditioning as an emergent property of spatially extended spiking neural circuits with synaptic plasticity. Front. Comput. Neurosci. 8:79. doi: 10.3389/fncom.2014.00079
Panda, P., Aketi, S. A., and Roy, K. (2020). Toward scalable, efficient, and accurate deep spiking neural networks with backward residual connections, stochastic softmax, and hybridization. Front. Neurosci. 14:653. doi: 10.3389/fnins.2020.00653
Panda, P., Allred, J. M., Ramanathan, S., and Roy, K. (2018). ASP: learning to forget with adaptive synaptic plasticity in spiking neural networks. IEEE J. Emerg. Sel. Top. Circ. Syst. 8, 51–64. doi: 10.1109/JETCAS.2017.2769684
Papandroulidakis, G., Vourkas, I., Abusleme, A., Sirakoulis, G. C., and Rubio, A. (2017). Crossbar-based memristive logic-in-memory architecture. IEEE Trans. Nanotechnol. 16, 491–501.
Perea, G., and Araque, A. (2007). Astrocytes potentiate transmitter release at single hippocampal synapses. Science 317, 1083–1086. doi: 10.1126/science.1144640
Pershin, Y. V., and Slipko, V. A. (2019a). Bifurcation analysis of a TaO memristor model. J. Phys. D. Appl. Phys. 52:505304. doi: 10.1088/1361-6463/AB4537
Pershin, Y. V., and Slipko, V. A. (2019b). Dynamical attractors of memristors and their networks. Europhys. Lett. 125, 1–6. doi: 10.1209/0295-5075/125/20002
Pestov, V. (2013). Is the k-NN classifier in high dimensions affected by the curse of dimensionality? Comput. Math. Appl. 65, 1427–1437.
Ponulak, F. (2005). ReSuMe: New Supervised Learning Method for Spiking Neural Networks. Poznań: Poznań University.
Ponulak, F., and Hopfield, J. (2013). Rapid, parallel path planning by propagating wavefronts of spiking neural activity. Front. Comput. Neurosci. 7:98. doi: 10.3389/fncom.2013.00098
Ponulak, F., and Kasinski, A. (2010). Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting. Neural Comput. 22, 467–510. doi: 10.1162/neco.2009.11-08-901
Pouget, A., Dayan, P., and Zemel, R. (2000). Information processing with population codes. Nat. Rev. Neurosci. 1, 125–132. doi: 10.1038/35039062
Prezioso, M., Mahmoodi, M. R., Bayat, F. M., Nili, H., Kim, H., Vincent, A., et al. (2018). Spike-timing-dependent plasticity learning of coincidence detection with passively integrated memristive circuits. Nat. Commun. 9, 1–8. doi: 10.1038/s41467-018-07757-y
Qin, Y. F., Bao, H., Wang, F., Chen, J., Li, Y., and Miao, X. S. (2020). Recent progress on memristive convolutional neural networks for edge intelligence. Adv. Intell. Syst. 2:2000114.
Querlioz, D., Bichler, O., Dollfus, P., and Gamrat, C. (2013). Immunity to device variations in a spiking neural network with memristive nanodevices. IEEE Trans. Nanotechnol. 12, 288–295.
Quian Quiroga, R. (2012). Concept cells: the building blocks of declarative memory functions. Nat. Rev. Neurosci. 13:587. doi: 10.1038/nrn3251
Quian Quiroga, R. (2019). Akakhievitch revisited. Comment on “The unreasonable effectiveness of small neural ensembles in high-dimensional brain” by Alexander N. Gorban et al. Phys. Life Rev. 28, 111–114.
Quian Quiroga, R., Reddy, L., Kreiman, G., Koch, C., and Fried, I. (2005). Invariant visual representation by single neurons in the human brain. Nature 435, 1102–1107. doi: 10.1038/nature03687
Quiroga, R. Q., and Panzeri, S. (2013). Principles of Neural Coding. Boca Raton, FL: CRC Press.
Rentzeperis, I., Laquitaine, S., and van Leeuwen, C. (2022). Adaptive rewiring of random neural networks generates convergent–divergent units. Commun. Nonlin. Sci. Numer. Simul. 107:106135.
Robbins, H., and Monro, S. (1951). A stochastic approximation method. Ann. Math. Stat. 22:400.
Rosenblatt, F. (1958). The perceptron: a probabilistic model for information storage and organization in the brain. Psychol. Rev. 65, 386–408. doi: 10.1037/h0042519
Rueckauer, B., Lungu, I.-A., Hu, Y., Pfeiffer, M., and Liu, S.-C. (2017). Conversion of continuous-valued deep networks to efficient event-driven networks for image classification. Front. Neurosci. 11:682. doi: 10.3389/fnins.2017.00682
Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). Learning representations by back-propagating errors. Nature 323, 533–536.
Russakovsky, O., Deng, J., Su, H., Krause, J., Satheesh, S., Ma, S., et al. (2015). ImageNet large scale visual recognition challenge. Int. J. Comp. Vis. 115, 211–252.
Ryabova, M. A., Antonov, D. A., Kruglov, A. V., Antonov, I. N., Filatov, D. O., and Gorshkov, O. N. (2021). In situ investigation of individual filament growth in conducting bridge memristor by contact scanning capacitance microscopy. J. Phys. Conf. Ser. 2086:012205. doi: 10.1088/1742-6596/2086/1/012205
Santello, M., Toni, N., and Volterra, A. (2019). Astrocyte function from information processing to cognition and cognitive impairment. Nat. Neurosci. 22, 154–166. doi: 10.1038/s41593-018-0325-8
Sattin, D., Magnani, F. G., Bartesaghi, L., Caputo, M., Fittipaldo, A. V., Cacciatore, M., et al. (2021). Theoretical models of consciousness: a scoping review. Brain Sci. 11:535. doi: 10.3390/brainsci11050535
Schmidhuber, J. (2015). Deep learning in neural networks: an overview. Neur. Netw. 61, 85–117. doi: 10.1016/j.neunet.2014.09.003
Schuman, C. D., Kulkarni, S. R., Parsa, M., Mitchell, J. P., Date, P., and Kay, B. (2022). Opportunities for neuromorphic computing algorithms and applications. Nat. Comput. Sci. 2, 10–19. doi: 10.1038/s43588-021-00184-y
Sevush, S. (2006). Single-neuron theory of consciousness. J. Theor. Biol. 238, 704–725. doi: 10.1016/j.jtbi.2005.06.018
Shchanikov, S., Bordanov, I., Belov, A., Korolev, D., Shamshin, M., Gryaznov, E., et al. (2021). “Memristive concept of a high-dimensional neuron,” in Proceedings of the 2021 Third IEEE Inter. Conf. Neurotechn. Neurointerf. (CNN), (Piscataway, NJ: IEEE), 96–99.
Sherrington, C. (1940). Man on His Nature. Cambridge: Cambridge University Press.
Shrestha, S. B., and Orchard, G. (2018). “SLAYER: spike layer error reassignment in time,” in Advances in Neural Information Processing Systems, eds S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett (Red Hook, NY: Curran Associates, Inc.).
Shrestha, S. B., and Song, Q. (2015). Adaptive learning rate of SpikeProp based on weight convergence analysis. Neur. Netw. 63, 185–198. doi: 10.1016/j.neunet.2014.12.001
Silva, S. M., and Ruano, A. E. (2005). “Application of Levenberg-Marquardt method to the training of spiking neural networks,” in Proceedings of the 2005 Int. Conf. Neur. Netw. Brain, (Piscataway, NJ: IEEE), 1354–1358.
Sjöström, P. J., Turrigiano, G. G., and Nelson, S. B. (2001). Rate, timing, and cooperativity jointly determine cortical synaptic plasticity. Neuron 32, 1149–1164. doi: 10.1016/s0896-6273(01)00542-6
Song, S., Miller, K. D., and Abbott, L. F. (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nat. Neurosci. 3:919.
Sporea, I., and Grüning, A. (2013). Supervised learning in multilayer spiking neural networks. Neural Comput. 25, 473–509.
Strukov, D. B., Snider, G. S., Stewart, D. R., and Williams, R. S. (2008). The missing memristor found. Nature 453, 80–83. doi: 10.1038/nature06932
Strukov, D. B., and Williams, R. S. (2009). Four-dimensional address topology for circuits with stacked multilayer crossbar arrays. Proc. Natl. Acad. Sci. U.S.A. 106, 20155–20158. doi: 10.1073/pnas.0906949106
Taherkhani, A., Belatreche, A., Li, Y., Cosma, G., Maguire, L. P., and McGinnity, T. M. (2020). A review of learning in biologically plausible spiking neural networks. Neur. Netw. 122, 253–272. doi: 10.1016/j.neunet.2019.09.036
Taherkhani, A., Belatreche, A., Li, Y., and Maguire, L. P. (2018). A supervised learning algorithm for learning precise timing of multiple spikes in multilayer spiking neural networks. IEEE Trans. Neural Netw. Learn. Syst. 29, 5394–5407. doi: 10.1109/TNNLS.2018.2797801
Tavanaei, A., Ghodrati, M., Kheradpisheh, S. R., Masquelier, T., and Maida, A. (2019). Deep learning in spiking neural networks. Neur. Netw. 111, 47–63.
Tavanaei, A., and Maida, A. S. (2015). A minimal spiking neural network to rapidly train and classify handwritten digits in binary and 10-digit tasks. Int. J. Adv. Res. Artif. Intell. 4, 1–8.
Teyler, T. J., and Discenna, P. (1984). The topological anatomy of the hippocampus: a clue to its function. Brain Res. Bull. 12, 711–719. doi: 10.1016/0361-9230(84)90152-7
Tolman, E. C., and Honzik, C. H. (1930). Introduction and removal of reward, and maze performance in rats. Univ. Calif. Public Psychol. 4, 257–275.
Turrigiano, G. G. (2017). The dialectic of Hebb and homeostasis. Philos. Trans. R. Soc. B Biol. Sci. 372:20160258. doi: 10.1098/rstb.2016.0258
Tyukin, I. Y., Gorban, A. N., Calvo, C., Makarova, J., and Makarov, V. A. (2019). High-dimensional brain: a tool for encoding and rapid learning of memories by single neurons. Bull. Math. Biol. 81, 4856–4888. doi: 10.1007/s11538-018-0415-5
Valdez, A. B., Papesh, M. H., Treiman, D. M., Smith, K. A., Goldinger, S. D., and Steinmetz, P. N. (2015). Distributed representation of visual objects by single neurons in the human brain. J. Neurosci. 35, 5180–5186.
Vasileiadis, N., Ntinas, V., Sirakoulis, G. C., and Dimitrakis, P. (2021a). In-memory-computing realization with a photodiode/memristor based vision sensor. Materials 14, 1–15. doi: 10.3390/ma14185223
Vasileiadis, N., Ntinas, V., Fyrigos, I., Karamani, R., Ioannou-Sougleridis, V., Normand, P., et al. (2021b). “A new 1P1R image sensor with in-memory computing properties based on silicon nitride devices,” in Proceedings of the 2021 IEEE Int. Symp. Circuits and Systems (ISCAS), (Piscataway, NJ: IEEE).
Villacorta-Atienza, J. A., Calvo, C., and Makarov, V. A. (2015). Prediction-for-CompAction: navigation in social environments using generalized cognitive maps. Biol. Cybern. 109, 307–320. doi: 10.1007/s00422-015-0644-8
Villacorta-Atienza, J. A., Calvo Tapia, C., Diez-Hermano, S., Sanchez-Jimenez, A., Lobov, S., Krilova, N., et al. (2021). Static internal representation of dynamic situations reveals time compaction in human cognition. J. Adv. Res. 28, 111–125. doi: 10.1016/j.jare.2020.08.008
Villacorta-Atienza, J. A., and Makarov, V. A. (2013). Wave-processing of long-scale information by neuronal chains. PLoS One 8:e0057440. doi: 10.1371/journal.pone.0057440
Villacorta-Atienza, J. A., Velarde, M. G., and Makarov, V. A. (2010). Compact internal representation of dynamic situations: neural network implementing the causality principle. Biol. Cybern. 103, 285–329. doi: 10.1007/s00422-010-0398-2
Vongehr, S., and Meng, X. (2015). The missing memristor has not been found. Sci. Rep. 5, 1–7. doi: 10.1038/srep11657
Wang, Z., Li, C., Song, W., Rao, M., Belkin, D., Li, Y., et al. (2019). Reinforcement learning with analogue memristor arrays. Nat. Electron. 2, 115–124. doi: 10.1038/s41928-019-0221-6
Wang, Z., Wu, H., Burr, G. W., Hwang, C. S., Wang, K. L., Xia, Q., et al. (2020). Resistive switching materials for information processing. Nat. Rev. Mater. 5, 173–195. doi: 10.1038/s41578-019-0159-3
Xia, Q., and Yang, J. J. (2019). Memristive crossbar arrays for brain-inspired computing. Nat. Mater. 18, 309–323.
Xin, J., and Embrechts, M. J. (2001). “Supervised learning with spiking neural networks,” in Proceedings of the IJCNN’01. Int. Joint Conf. Neur. Network (Cat. No.01CH37222), (Piscataway, NJ: IEEE), 1772–1777.
Yao, P., Wu, H., Gao, B., Tang, J., Zhang, Q., Zhang, W., et al. (2020). Fully hardware-implemented memristor convolutional neural network. Nature 577, 641–646. doi: 10.1038/s41586-020-1942-4
Yin, B., Corradi, F., and Bohté, S. M. (2021). Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nat. Mach. Intell. 3, 905–913. doi: 10.1038/s42256-021-00397-w
Yin, J., and Yuan, Q. (2015). Structural homeostasis in the nervous system: a balancing act for wiring plasticity and stability. Front. Cell. Neurosci. 8:439. doi: 10.3389/fncel.2014.00439
Zahari, F., Pérez, E., Mahadevaiah, M. K., Kohlstedt, H., Wenger, C., and Ziegler, M. (2020). Analogue pattern recognition with stochastic switching binary CMOS-integrated memristive devices. Sci. Rep. 10:14450. doi: 10.1038/s41598-020-71334-x
Zamarreño-Ramos, C., Camuñas-Mesa, L. A., Perez-Carrasco, J. A., Masquelier, T., Serrano-Gotarredona, T., and Linares-Barranco, B. (2011). On spike-timing-dependent-plasticity, memristive devices, and building a self-learning visual cortex. Front. Neurosci. 5:26. doi: 10.3389/fnins.2011.00026
Zambrano, D., Nusselder, R., Scholte, H. S., and Bohté, S. M. (2019). Sparse computation in adaptive spiking neural networks. Front. Neurosci. 12:987. doi: 10.3389/fnins.2018.00987
Zenke, F., and Vogels, T. P. (2021). The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks. Neural Comput. 33, 899–925. doi: 10.1162/neco_a_01367
Zhou, Y., Wu, H., Gao, B., Wu, W., Xi, Y., Yao, P., et al. (2019). Associative memory for image recovery with a high-performance memristor array. Adv. Funct. Mater. 29:1900155. doi: 10.1002/adfm.201900155