Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/29245
Title: Exploring 8-bit Arithmetic for Training Spiking Neural Networks
Authors: Kalganova, T; Fernandez-Hart, T; Knight, JC
Issue Date: 29-Jul-2024
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: Kalganova, T., Fernandez-Hart, T. and Knight, J.C. (2024) 'Exploring 8-bit Arithmetic for Training Spiking Neural Networks', Proceedings of the 2024 IEEE International Conference on Omni-layer Intelligent Systems (COINS), London, UK, 29-31 July, pp. 1-6. doi: 10.1109/COINS61597.2024.10622154.
Abstract: Spiking Neural Networks (SNNs) offer advantages over traditional Artificial Neural Networks (ANNs) in terms of biological plausibility, noise tolerance, and temporal processing capabilities. Additionally, SNNs can achieve significant energy efficiency when deployed on specialized neuromorphic platforms. However, the common practice of training SNNs using Back Propagation Through Time (BPTT) on GPUs before deployment is resource-intensive and hinders scalability. Although reduced precision inference with SNNs has been explored, the use of reduced precision during training remains largely unexamined. This study investigates the potential of posit arithmetic, a novel numerical format, for training SNNs on future posit-enabled accelerators. We evaluate the performance of 8-bit posit and floating-point arithmetic compared to 32-bit floating-point on two datasets. Our results show that 8-bit posits can match the performance of 32-bit floating-point arithmetic when all training components are quantised. These findings suggest that posit arithmetic could be a promising foundation for developing efficient hardware accelerators dedicated to SNN training. Such advancements are essential for reducing resource usage and enhancing energy efficiency, enabling the exploration of larger and more complex SNN architectures, and promoting their wider adoption.
URI: https://bura.brunel.ac.uk/handle/2438/29245
DOI: https://doi.org/10.1109/COINS61597.2024.10622154
ISBN: 979-8-3503-4959-7 (ebk); 979-8-3503-4960-3 (PoD)
ISSN: 2996-5322
Other Identifiers: ORCiD: Tatiana Kalganova https://orcid.org/0000-0003-4859-7152
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers
Files in This Item:
File | Description | Size | Format
---|---|---|---
FullText.pdf | Copyright © 2024 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works by sending a request to pubs-permissions@ieee.org. See https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/ for more information | 568.38 kB | Adobe PDF
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.
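As a rough illustration of the 8-bit posit format the abstract refers to, the sketch below decodes a posit bit pattern (sign, regime, exponent, fraction) into a real value. It assumes es = 2, as in the Posit Standard (2022); the record above does not state the exact posit configuration the paper uses, so this is an assumption, not the authors' implementation.

```python
def decode_posit(bits: int, n: int = 8, es: int = 2) -> float:
    """Decode an n-bit posit (assumed es=2 layout; illustrative only)."""
    mask = (1 << n) - 1
    if bits == 0:
        return 0.0
    if bits == 1 << (n - 1):
        return float("nan")  # NaR (Not a Real)
    sign = (bits >> (n - 1)) & 1
    if sign:
        bits = (-bits) & mask  # two's complement for negative posits
    # Regime: run of identical bits after the sign, terminated by its opposite
    pos = n - 2
    first = (bits >> pos) & 1
    run = 0
    while pos >= 0 and ((bits >> pos) & 1) == first:
        run += 1
        pos -= 1
    k = run - 1 if first else -run
    pos -= 1  # skip the regime-terminating bit
    # Exponent: up to es bits, zero-padded if the pattern runs out
    exp = 0
    for _ in range(es):
        exp <<= 1
        if pos >= 0:
            exp |= (bits >> pos) & 1
            pos -= 1
    # Fraction: remaining bits, with an implicit leading 1
    frac, weight = 1.0, 0.5
    while pos >= 0:
        if (bits >> pos) & 1:
            frac += weight
        weight /= 2
        pos -= 1
    value = frac * 2.0 ** (k * (1 << es) + exp)
    return -value if sign else value
```

For example, `decode_posit(0x40)` yields 1.0 and `decode_posit(0x60)` yields 16.0 (the "useed" of 2^(2^es) for es = 2), showing the tapered precision that makes posits attractive for low-bit training.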