Please use this identifier to cite or link to this item:
Title: A novel neural network approach to cDNA microarray image segmentation
Authors: Wang, Z
Zineddin, B
Liang, J
Zeng, N
Li, Y
Du, M
Cao, J
Liu, X
Keywords: Artificial neural networks;Microarray image;Adaptive segmentation;Kohonen neural networks
Issue Date: 2013
Publisher: Elsevier
Citation: Computer Methods and Programs in Biomedicine, 111(1): 189-198, Jul 2013
Abstract: Microarray technology has become a great source of information for biologists seeking to understand the workings of DNA, one of the most complex codes in nature. Microarray images typically contain several thousand small spots, each representing a different gene in the experiment. A key step in extracting information from a microarray image is segmentation, whose aim is to identify which pixels within the image represent which gene. This task is greatly complicated by noise within the image and by the wide variation in the values of the pixels belonging to a typical spot. Many methods have been proposed in the past for the segmentation of microarray images. In this paper, a new method is proposed that utilizes a series of artificial neural networks based on the multi-layer perceptron (MLP) and Kohonen networks. The proposed method is applied to a set of real-world cDNA images, and quantitative comparisons with the commercial software GenePix(®) are carried out in terms of the peak signal-to-noise ratio (PSNR). The method is shown not only to deliver results comparable and even superior to those of existing techniques but also to have a faster run time.
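As a rough illustration of the two ideas named in the abstract (and not the authors' actual implementation), the sketch below clusters pixel intensities with a tiny one-dimensional Kohonen-style map, here reduced to a winner-take-all update since only two nodes (spot vs. background) are used, and computes the PSNR used for the quantitative comparison. The function names and all parameter choices are illustrative assumptions.

```python
import numpy as np

def kohonen_segment(pixels, n_nodes=2, epochs=20, lr=0.5, seed=0):
    """Cluster pixel intensities with a 1-D Kohonen-style map.

    Illustrative sketch only: with two nodes the neighbourhood function
    degenerates, so this is a winner-take-all (online) update. Returns an
    integer label per pixel, same shape as the input.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(pixels, dtype=np.float64).ravel()
    weights = rng.uniform(x.min(), x.max(), n_nodes)  # random initial codebook
    for epoch in range(epochs):
        eta = lr * (1.0 - epoch / epochs)             # decaying learning rate
        for v in rng.permutation(x):
            winner = np.argmin(np.abs(weights - v))   # best-matching node
            weights[winner] += eta * (v - weights[winner])
    # Assign each pixel to its nearest node.
    return np.argmin(np.abs(x[:, None] - weights[None, :]),
                     axis=1).reshape(np.shape(pixels))

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equally sized images."""
    ref = np.asarray(reference, dtype=np.float64)
    tst = np.asarray(test, dtype=np.float64)
    mse = np.mean((ref - tst) ** 2)                   # mean squared error
    if mse == 0:
        return float("inf")                           # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```

For well-separated foreground and background intensities the two codebook weights settle near the cluster means, so the returned labels partition the spot from its background; PSNR then quantifies how closely a segmented or reconstructed image matches a reference.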
Description: This is the post-print version of the Article. The official published version can be accessed from the link below. Copyright @ 2013 Elsevier.
ISSN: 0169-2607
Appears in Collections: Publications
Computer Science
Dept of Computer Science Research Papers

Files in This Item:
File: Fulltext.pdf | Size: 439.85 kB | Format: Adobe PDF | View/Open

Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.