Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/4945
Full metadata record
dc.contributor.author: Morris, D
dc.contributor.author: Wang, Z
dc.contributor.author: Liu, X
dc.date.accessioned: 2011-04-04T10:25:02Z
dc.date.available: 2011-04-04T10:25:02Z
dc.date.issued: 2007
dc.identifier.citation: International Journal of Computer Mathematics, 84(5): 669-678, May 2007
dc.identifier.issn: 0020-7160
dc.identifier.uri: http://bura.brunel.ac.uk/handle/2438/4945
dc.description: This is the post-print version of the article. The official published version can be obtained from the link below. Copyright 2007 Taylor & Francis Ltd
dc.description.abstract: A novel algorithm for detecting microarray subgrids is proposed. The only input to the algorithm is the raw microarray image, which can be of any resolution, and the subgrid detection is performed with no prior assumptions. The algorithm consists of a series of methods for spot shape detection, spot filtering, spot spacing estimation, and subgrid shape detection. It is shown to divide images of varying quality into subgrid regions with no manual interaction. The algorithm is robust against high levels of noise and high percentages of poorly expressed or missing spots. In addition, it is shown to be effective in locating regular groupings of primitives in a set of non-microarray images, suggesting potential applications in the general area of image processing.
dc.language.iso: en
dc.publisher: Taylor & Francis
dc.subject: Microarray
dc.subject: Gridding
dc.subject: Image filter
dc.subject: Shape detection
dc.subject: Sub-grid detection
dc.title: Microarray sub-grid detection: A novel algorithm
dc.type: Research Paper
dc.identifier.doi: http://dx.doi.org/10.1080/00207160701242292
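
The abstract describes a pipeline of spot shape detection, spot filtering, spot spacing estimation, and subgrid shape detection. The sketch below is a minimal, generic illustration of that kind of projection-based pipeline, not the published algorithm: it assumes a grayscale image supplied as a NumPy array, and the threshold, the gap_factor parameter, and all function names (detect_spots, estimate_spacing, projection_gaps, find_subgrids) are hypothetical choices made only for this illustration.

# Illustrative sketch only: a generic projection-based subgrid finder in the
# spirit of the steps named in the abstract (spot detection, spot filtering,
# spacing estimation, subgrid localisation). It is not the published
# algorithm; the threshold, gap_factor, and all function names are hypothetical.

import numpy as np


def detect_spots(image, threshold=0.5):
    """Crude spot detection: normalise intensities and keep bright pixels."""
    norm = (image - image.min()) / (np.ptp(image) + 1e-9)
    return norm > threshold


def estimate_spacing(mask, axis):
    """Rough spot spacing: median gap between consecutive occupied rows/columns."""
    occupied = np.flatnonzero(mask.sum(axis=axis) > 0)
    if occupied.size < 2:
        return 1
    return max(int(np.median(np.diff(occupied))), 1)


def projection_gaps(mask, axis, min_gap):
    """Centres of empty runs in the row/column projection longer than min_gap."""
    empty = mask.sum(axis=axis) == 0
    gaps, start = [], None
    for i, is_empty in enumerate(empty):
        if is_empty and start is None:
            start = i
        elif not is_empty and start is not None:
            if i - start >= min_gap:
                gaps.append((start + i) // 2)
            start = None
    return gaps


def find_subgrids(image, gap_factor=2):
    """Split the image at gaps much wider than the typical spot spacing."""
    mask = detect_spots(image)
    row_cuts = [0] + projection_gaps(
        mask, axis=1, min_gap=gap_factor * estimate_spacing(mask, axis=1)) + [image.shape[0]]
    col_cuts = [0] + projection_gaps(
        mask, axis=0, min_gap=gap_factor * estimate_spacing(mask, axis=0)) + [image.shape[1]]
    # Each subgrid region is a (row_start, row_end, col_start, col_end) rectangle.
    return [(r0, r1, c0, c1)
            for r0, r1 in zip(row_cuts, row_cuts[1:])
            for c0, c1 in zip(col_cuts, col_cuts[1:])]


if __name__ == "__main__":
    # Synthetic image with a 2 x 2 arrangement of subgrids of single-pixel spots.
    img = np.zeros((80, 80))
    for row_offset in (5, 45):
        for col_offset in (5, 45):
            img[row_offset:row_offset + 30:5, col_offset:col_offset + 30:5] = 1.0
    print(find_subgrids(img))  # expect four rectangular subgrid regions

A robust detector along the lines described in the abstract would additionally filter candidate spots by shape and cope with rotation, noise, and missing spots, which this sketch does not attempt.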
Appears in Collections:
Computer Science
Dept of Computer Science Research Papers

Files in This Item:
Fulltext.pdf (1.35 MB, Adobe PDF)

