Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/5006
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Shepperd, M
dc.date.accessioned: 2011-04-12T13:03:56Z
dc.date.available: 2011-04-12T13:03:56Z
dc.date.issued: 2011
dc.identifier.uri: http://bura.brunel.ac.uk/handle/2438/5006
dc.description.abstract: BACKGROUND: Prediction, e.g. of project cost, is an important concern in software engineering. PROBLEM: Although many empirical validations of software engineering prediction systems have been published, no one approach dominates, and making sense of conflicting empirical results is proving challenging. METHOD: We propose a new approach to evaluating competing prediction systems based upon an unbiased statistic (Standardised Accuracy), analysis of results relative to the baseline technique of guessing, and calculation of effect sizes. RESULTS: Two empirical studies are revisited and the published results are shown to be misleading when re-analysed using our new approach. CONCLUSION: Biased statistics such as MMRE are deprecated. By contrast, our approach leads to valid results. Such steps will greatly assist in performing future meta-analyses. (en_US)
dc.language.iso: en (en_US)
dc.subject: Software engineering (en_US)
dc.subject: Software project management (en_US)
dc.subject: Forecasting (en_US)
dc.subject: Effort prediction (en_US)
dc.subject: Accuracy (en_US)
dc.subject: Empirical evaluation (en_US)
dc.title: New ideas and emerging research: evaluating prediction system accuracy (en_US)
dc.type: Research Paper (en_US)
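The evaluation approach described in the abstract (an unbiased Standardised Accuracy statistic, compared against a baseline of random guessing, in place of the biased MMRE) can be sketched as follows. This is an illustrative Python sketch based only on the abstract's description: the guessing baseline (predicting each case with the actual value of another randomly drawn case), the function names, and the effort values are assumptions for illustration, not the paper's actual formulation or data.

```python
import random

def mmre(actual, predicted):
    # Mean Magnitude of Relative Error: the biased statistic the abstract
    # deprecates. Dividing each residual by the actual value systematically
    # favours prediction systems that underestimate.
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def mar(actual, predicted):
    # Mean Absolute Residual: scale-dependent, but not biased toward
    # under- or over-estimation.
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def sa(actual, predicted, runs=1000, seed=0):
    # Standardised Accuracy (sketch): the prediction system's MAR relative
    # to the mean MAR of naive random guessing, where each case is
    # "predicted" with the actual value of another randomly chosen case.
    # A value near 0 means no better than guessing; higher is better.
    rng = random.Random(seed)
    guess_mars = []
    for _ in range(runs):
        guesses = [rng.choice([a for j, a in enumerate(actual) if j != i])
                   for i in range(len(actual))]
        guess_mars.append(mar(actual, guesses))
    mar_p0 = sum(guess_mars) / runs
    return 1.0 - mar(actual, predicted) / mar_p0

# Hypothetical effort data (e.g. person-days), purely for illustration.
actual = [120.0, 80.0, 300.0, 45.0, 150.0]
predicted = [100.0, 90.0, 280.0, 50.0, 160.0]
print("MMRE:", round(mmre(actual, predicted), 3))
print("SA:  ", round(sa(actual, predicted), 3))
```

Comparing SA against the guessing baseline makes "is this system better than doing nothing sensible?" an explicit, answerable question, which is harder to pose with a ratio statistic like MMRE.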
Appears in Collections:Computer Science
Dept of Computer Science Research Papers
Software Engineering (B-SERC)

Files in This Item:
File: Fulltext.pdf (257.64 kB, Adobe PDF)


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.