Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/29008
Title: Improving classifier-based effort-aware software defect prediction by reducing ranking errors
Authors: Guo, Y
Shepperd, M
Li, N
Keywords: software defect prediction;effort-aware;ranking error;ranking strategy;software engineering (cs.SE)
Issue Date: 18-Jun-2024
Publisher: Association for Computing Machinery (ACM)
Citation: Guo, Y., Shepperd, M. and Li, N. (2024) 'Improving classifier-based effort-aware software defect prediction by reducing ranking errors', International Conference on Evaluation and Assessment in Software Engineering (EASE) 2024, Salerno, Italy, 18-21 June, pp. 1 - 10.
Abstract: Context: Software defect prediction utilizes historical data to direct software quality assurance resources to potentially problematic components. Effort-aware (EA) defect prediction prioritizes more bug-prone components by taking cost-effectiveness into account. In other words, it is a ranking problem; however, existing classification-based ranking strategies give limited consideration to ranking errors. Objective: To improve the performance of classifier-based EA ranking methods by focusing on ranking errors. Method: We propose a ranking score calculation strategy called EA-Z which sets a lower bound to avoid near-zero ranking errors. We investigate four primary EA ranking strategies with 16 classification learners, and conduct experiments comparing EA-Z with the four existing strategies. Results: Experimental results from 72 data sets show that EA-Z is the best ranking score calculation strategy in terms of Recall@20% and Popt when considering all 16 learners. Among particular learners, the imbalanced ensemble learners UBag-svm and UBst-rf achieve top performance with EA-Z. Conclusion: Our study indicates the effectiveness of reducing ranking errors for classifier-based effort-aware defect prediction. We recommend using EA-Z with imbalanced ensemble learning.
URI: https://bura.brunel.ac.uk/handle/2438/29008
ISBN: 979-8-4007-1701-7 (ebk)
Other Identifiers: ORCiD: Yuchen Guo https://orcid.org/0000-0003-2756-9216
ORCiD: Martin Shepperd https://orcid.org/0000-0003-1874-6145
ORCiD: Ning Li https://orcid.org/0000-0001-7394-0640
Appears in Collections:Dept of Computer Science Research Papers

Files in This Item:
File: FullText.pdf (731.9 kB, Adobe PDF) — View/Open
Description: Copyright © 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM. ACM ISBN 979-8-4007-1701-7/24/06. https://doi.org/10.1145/3661167.3661195. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in EASE '24: Proceedings of the 28th International Conference on Evaluation and Assessment in Software Engineering, https://doi.org/10.1145/3661167 (see: https://www.acm.org/publications/policies/publication-rights-and-licensing-policy).
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.