ISCAS 2023 Tutorial
Material and Physical Reservoir Computing for Beyond-CMOS Electronics
May 21, 2023
Abstract
Traditional computing is based on an engineering approach that imposes logical states and a computational model upon a physical substrate. Physical or material computing, on the other hand, harnesses the inherent, naturally occurring properties of a physical substrate to perform computation. Reservoir computing is a commonly used paradigm for doing so. In this tutorial, you will learn what reservoir computing is and how to use it for computing with emerging devices and fabrics. You will also learn about the current state of the art and about the opportunities and challenges that exist for future research. The tutorial is relevant for anybody interested in beyond-CMOS and beyond-von-Neumann architectures, ML, AI, neuromorphic systems, and computing with novel devices and circuits.
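To give a concrete flavor of the paradigm before diving into the slides: in reservoir computing, a fixed, randomly connected dynamical system (the "reservoir") transforms an input stream into a rich state trajectory, and only a simple linear readout is trained. The sketch below is a minimal echo state network in NumPy on a toy one-step-ahead sine prediction task; the reservoir size, spectral radius, and ridge parameter are illustrative choices, not values from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Build a fixed random reservoir (echo state network style) ---
n_inputs, n_reservoir = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
# Rescale to spectral radius < 1 so the reservoir has fading memory
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the untrained reservoir with input sequence u; collect states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# --- Toy task: predict the next sample of a sine wave ---
t = np.arange(2200) * 0.05
signal = np.sin(t)
u, y = signal[:-1], signal[1:]          # input and one-step-ahead target

X = run_reservoir(u)
washout = 100                           # discard the initial transient
X_tr, y_tr = X[washout:2000], y[washout:2000]

# --- Train only the linear readout (ridge regression) ---
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_reservoir),
                        X_tr.T @ y_tr)

# --- Evaluate on held-out data ---
y_pred = X[2000:] @ W_out
nrmse = np.sqrt(np.mean((y_pred - y[2000:]) ** 2)) / np.std(y[2000:])
print(f"test NRMSE: {nrmse:.4f}")
```

The key point, and what makes the paradigm attractive for physical substrates, is that the reservoir itself is never trained: any sufficiently rich nonlinear dynamical system (a memristive network, a photonic cavity, a mechanical structure) can stand in for the random matrix `W`, with only the readout fitted in software.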
Slides
- Download: teuscher_iscas23_tutorial_v3.pdf
- Version: V3.0, May 21, 2023
Recommended reading to probe further
- Tanaka et al., Recent advances in physical reservoir computing: A review, 2019, https://doi.org/10.1016/j.neunet.2019.03.005
- Nakajima, Physical reservoir computing—an introductory perspective, 2020, https://doi.org/10.35848/1347-4065/ab8d4f
- Cucchi et al., Hands-on reservoir computing: a tutorial for practical implementation, 2022, https://doi.org/10.1088/2634-4386/ac7db7
- Lukoševičius, A practical guide to applying echo state networks, 2012, https://doi.org/10.1007/978-3-642-35289-8_36
Bibliography
Citations used on the slides, in order of appearance.
Last updated: May 17, 2023
- Hornik, Stinchcombe, & White, 1989, https://doi.org/10.1016/0893-6080(89)90020-8
- Schäfer & Zimmermann, 2007, https://doi.org/10.1142/S0129065707001111
- Crutchfield et al., 2010, https://doi.org/10.1063/1.3492712
- Tour et al., 2002, https://doi.org/10.1109/TNANO.2002.804744
- Lukoševičius & Jaeger, 2009, https://doi.org/10.1016/j.cosrev.2009.03.005
- Maass, Natschläger, & Markram, 2002, https://doi.org/10.1162/089976602760407955
- Jaeger, The "echo state" approach to analysing and training recurrent neural networks (with an erratum note), GMD Technical Report 148, German National Research Center for Information Technology, Bonn, Germany, 2001
- Rojas, Neural Networks: A Systematic Introduction, Springer-Verlag, Berlin, 1996
- Obst et al., 2013, https://doi.org/10.1016/j.nancom.2013.08.005
- Fernando & Sojakka, 2003, https://doi.org/10.1007/978-3-540-39432-7_63
- Sillin et al., 2013, https://doi.org/10.1088/0957-4484/24/38/384004
- Vandoorne et al., 2011, https://doi.org/10.1109/TNN.2011.2161771
- Kudithipudi et al., 2016, https://doi.org/10.3389/fnins.2015.00502
- Lukoševičius, 2012, https://doi.org/10.1007/978-3-642-35289-8_36
- Cucchi et al., 2022, https://doi.org/10.1088/2634-4386/ac7db7
- Boyd & Chua, 1985, https://doi.org/10.1109/TCS.1985.1085649
- Yaeger, 2021, https://doi.org/10.1007/978-981-13-1687-6
- Tanaka et al., 2019, https://doi.org/10.1016/j.neunet.2019.03.005
- Tran & Teuscher, 2023, https://doi.org/10.1109/JETCAS.2023.3235242
- Gallicchio, Micheli, & Pedrelli, 2017, https://doi.org/10.1016/j.neucom.2016.12.089
- Wang & Li, 2016, https://doi.org/10.1109/ICPR.2016.7900035
- Snyder, Goudarzi, & Teuscher, 2013, https://doi.org/10.1103/PhysRevE.87.042808
- Hochstetter et al., 2021, https://doi.org/10.1038/s41467-021-24260-z
- Carroll, 2020, https://doi.org/10.1063/5.0038163
- Ricciardi & Milano, 2022, https://doi.org/10.3389/fnano.2022.850561
- Miller & Hickinbotham, 2018, https://doi.org/10.1007/978-3-319-65826-1_3
- Lee et al., 2023, https://arxiv.org/abs/2303.00708
- Lilak et al., 2021, https://doi.org/10.3389/fnano.2021.675792
- Milano et al., 2023, https://doi.org/10.1088/1361-6463/acb7ff
- Khan et al., 2021, https://arxiv.org/abs/2110.13849
- Rowlands et al., 2021, https://arxiv.org/abs/2103.02522
- Sinha, Kulkarni, & Teuscher, 2011, https://doi.org/10.1109/NANO.2011.6144623
- Kulkarni & Teuscher, 2012, https://doi.org/10.1145/2765491.2765531
- Du et al., 2017, https://doi.org/10.1038/s41467-017-02337-y
- Zhang et al., 2022, https://doi.org/10.1126/science.abj7943
- Tran & Teuscher, 2017, https://doi.org/10.1109/NANOARCH.2017.8053719
- Takano et al., 2018, https://doi.org/10.1364/OE.26.029424
- Coulombe, 2017, https://doi.org/10.1371/journal.pone.0178663
- Maksymov & Pototsky, 2023, https://arxiv.org/abs/2303.01801
- Goudarzi, Lakin, & Stefanovic, 2013, https://doi.org/10.1007/978-3-319-01928-4_6
- Yahiro et al., 2018, https://doi.org/10.1162/isal_a_00013
- Gauthier et al., 2021, https://doi.org/10.1038/s41467-021-25801-2
- Chahine et al., 2023, https://doi.org/10.1126/scirobotics.adc8892
Other tutorials
- IJCNN 2021 Tutorial: Reservoir Computing: Randomized Recurrent Neural Networks, Claudio Gallicchio: https://www.youtube.com/watch?v=XJg7VdN7g-0
- Physical reservoir computing: lessons learned and open questions, Joni Dambre: https://www.youtube.com/watch?v=JqdG5bA_Nlk
- Physical reservoir computing for embodied intelligence, Kohei Nakajima: https://www.youtube.com/watch?v=dw-fJNFT710
- Introduction to next generation reservoir computing, Daniel Gauthier: https://www.youtube.com/watch?v=wbH4En-k5Gs
- Reservoir Computing with Superconducting Circuits, Graham Rowlands: https://www.youtube.com/watch?v=1bWjyQ1326g