Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/24711
Title: Deep learning for automatic optical inspection and quality evaluation of semiconductor and optoelectronic manufacturing
Authors: Abu Ebayyeh, Abd Al Rahman M.
Advisors: Mousavi, A
Katsou, E
Keywords: Industrial automation;Computer vision;Machine learning;Electronics industry;Artificial intelligence
Issue Date: 2022
Publisher: Brunel University London
Abstract: The electronics industry is one of the fastest-evolving, most innovative, and most competitive industries. To meet the high consumer demand for electronic components, product quality standards must be well maintained. Automatic optical inspection (AOI) is a non-destructive technique used for quality inspection of a wide range of products. It is considered robust and can replace human inspectors, who are subject to dullness and fatigue when performing inspection tasks. A fully automated optical inspection system consists of hardware and software. The hardware setup includes the image sensor and illumination settings and is responsible for acquiring the digital image, while the software implements an inspection algorithm that extracts features from the acquired images and classifies them as defective or non-defective according to the user's requirements. This research explores methods reported in the literature for quality evaluation and AOI of the basic building blocks of electronic and optoelectronic devices: semiconductor and optoelectronic wafers. Deep learning (DL) techniques have proven their ability to extract features automatically within their hidden layers, saving expert knowledge and effort compared with conventional image processing and feature extraction techniques. DL can also perform classification based on the examples the network has learned, whereas conventional rule-based classification uses conditional statements to reach a decision. Given these advantages, DL networks were adopted as the decision-making system for classifying defect patterns and anomalies in semiconductor and optoelectronic wafers. The open-source WM-811K dataset was used in our investigation of semiconductor wafer bin map (WBM) patterns.
The dataset was observed to be imbalanced, with some classes lacking enough samples for training; therefore, a Deep Convolutional Generative Adversarial Network (DCGAN) was used to upsample and enlarge the dataset. A novel improved capsule network, WaferCaps, was also proposed to classify the defect patterns into eight classes. The performance of the proposed DCGAN and WaferCaps was compared with that of other deep learning models, including the original Capsule Network (CapsNet), a Convolutional Neural Network (CNN), and a Multi-Layer Perceptron (MLP). In all experiments, the WM-811K dataset was used for data upsampling and training. The proposed approach proved effective at generating new synthetic data and classifying them, with a training accuracy of 99.59%, a validation accuracy of 97.53%, and a test accuracy of 91.4%. For analysing optoelectronic wafers, we focused on the waveguide of the quantum cascade laser (QCL) on the wafer. Owing to the lack of sufficient dirt and defect samples, data augmentation was used to increase the number of images. A decision fusion approach was used to integrate the classification outputs of the CNN and WaferCaps, since each performed best on different classes. The proposed approach was compared with similar DL algorithms and achieved an overall accuracy of 98.5%.
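The decision-fusion step mentioned in the abstract can be illustrated with a minimal sketch: combining the per-class probability outputs of two classifiers by a weighted average and taking the argmax. This is a generic illustration of the fusion idea only; the function name, the equal weights, and the toy probabilities are assumptions, not the thesis's actual implementation.

```python
import numpy as np

def fuse_decisions(probs_a, probs_b, weights=(0.5, 0.5)):
    """Weighted average of two classifiers' class-probability outputs.

    probs_a, probs_b: arrays of shape (n_samples, n_classes), rows summing to 1.
    Returns the fused probabilities and the predicted class indices.
    (Illustrative sketch; the real fusion rule may differ.)
    """
    fused = weights[0] * np.asarray(probs_a) + weights[1] * np.asarray(probs_b)
    return fused, fused.argmax(axis=1)

# Toy example: the two models disagree on sample 0; fusion resolves it.
cnn_probs = np.array([[0.6, 0.3, 0.1],
                      [0.2, 0.7, 0.1]])
caps_probs = np.array([[0.2, 0.7, 0.1],
                       [0.1, 0.8, 0.1]])
fused, labels = fuse_decisions(cnn_probs, caps_probs)
```

With equal weights, sample 0 fuses to [0.4, 0.5, 0.1], so the ensemble predicts class 1 even though the CNN alone preferred class 0; this is how fusion can exploit each model's strength on different classes.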
Description: This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University London
URI: http://bura.brunel.ac.uk/handle/2438/24711
Appears in Collections:Electronic and Computer Engineering
Dept of Electronic and Computer Engineering Research Papers

Files in This Item:
File: FulltextThesis.pdf
Description: Embargoed until 16/06/2023
Size: 26.64 MB
Format: Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.