Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/29870
Title: Approximate Computing: Concepts, Architectures, Challenges, Applications, and Future Directions
Authors: Dalloo, AM
Humaidi, AJ
Al Mhdawi, AK
Al-Raweshidy, H
Keywords: approximate computing;approximate programming language;approximate memory;circuit-level;approximate machine learning;deep learning;approximate logic synthesis;statistical and neuromorphic computing;cross layer and end-to-end approximate computing
Issue Date: 25-Sep-2024
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: Dalloo, A.M. et al. (2024) 'Approximate Computing: Concepts, Architectures, Challenges, Applications, and Future Directions', IEEE Access, 0 (early access), pp. 1 - 70. doi: 10.1109/ACCESS.2024.3467375.
Abstract: The unprecedented progress in computational technologies has led to a substantial proliferation of artificial intelligence applications, notably in the era of big data and IoT devices. In the face of exponential data growth and increasingly complex computations, conventional computing encounters substantial obstacles pertaining to energy efficiency, computational speed, and area. Because of the diminishing returns of technology scaling and the increased demands of computing workloads, novel design techniques are required to increase performance and decrease power consumption. Approximate computing, now considered a promising paradigm, achieves considerable reductions in overhead cost (i.e., energy, area, and latency) at the expense of a modest (i.e., still acceptable) deterioration in application accuracy. Approximate computing at different levels (data, circuit, architecture, and software) has therefore attracted both the research and industrial communities. This paper presents a comprehensive review of the major research areas at each level of approximate computing, exploring their underlying principles, potential benefits, and associated trade-offs. This is a burgeoning field that seeks to balance computational efficiency with acceptable accuracy. The paper highlights opportunities where these techniques can be effectively applied, such as applications in which perfect accuracy is not a strict requirement. It also assesses the application of approximate computing techniques in various domains, especially machine learning (ML) algorithms and IoT. Furthermore, this review underscores the challenges encountered in implementing approximate computing techniques and highlights potential future research avenues. The anticipation is that this survey will stimulate further discourse and underscore the necessity for continued research and development to fully exploit the potential of approximate computing.
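The cost-accuracy trade-off the abstract describes can be illustrated with a minimal sketch of loop perforation, one classic software-level approximate computing technique. This example is hypothetical and not taken from the paper: the function names, the `skip` parameter, and the synthetic data are illustrative assumptions.

```python
def exact_mean(xs):
    """Baseline: visits every element (full cost, full accuracy)."""
    return sum(xs) / len(xs)

def approx_mean(xs, skip=4):
    """Loop perforation: visit only every `skip`-th element,
    trading a small accuracy loss for roughly skip-x less work."""
    sampled = xs[::skip]
    return sum(sampled) / len(sampled)

# Synthetic workload: one million values cycling through 0..99.
data = [float(i % 100) for i in range(1_000_000)]

exact = exact_mean(data)            # exact mean is 49.5
approx = approx_mean(data, skip=4)  # inspects only 25% of the data
error = abs(exact - approx) / exact # relative accuracy degradation
```

Here the perforated loop does about a quarter of the work yet stays within a few percent of the exact result, which is the kind of "modest, still acceptable" accuracy loss the survey refers to; real systems choose the perforation rate per application based on its error tolerance.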
URI: https://bura.brunel.ac.uk/handle/2438/29870
DOI: https://doi.org/10.1109/ACCESS.2024.3467375
ISSN: 2169-3536
Other Identifiers: ORCiD: Ayad M. Dalloo https://orcid.org/0000-0002-8748-1630
ORCiD: Ammar K. Al Mhdawi https://orcid.org/0000-0003-1806-1189
ORCiD: Hamed Al-Raweshidy https://orcid.org/0000-0002-3702-8192
Appears in Collections:Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File: FullText.pdf (2.9 MB, Adobe PDF)
Description: Copyright © 2024 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/


This item is licensed under a Creative Commons License.