Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/5006
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Shepperd, M | - |
dc.date.accessioned | 2011-04-12T13:03:56Z | - |
dc.date.available | 2011-04-12T13:03:56Z | - |
dc.date.issued | 2011 | - |
dc.identifier.uri | http://bura.brunel.ac.uk/handle/2438/5006 | - |
dc.description.abstract | BACKGROUND: Prediction, e.g. of project cost, is an important concern in software engineering. PROBLEM: Although many empirical validations of software engineering prediction systems have been published, no one approach dominates, and sense-making of conflicting empirical results is proving challenging. METHOD: We propose a new approach to evaluating competing prediction systems based upon an unbiased statistic (Standardised Accuracy), analysis of results relative to the baseline technique of guessing, and calculation of effect sizes. RESULTS: Two empirical studies are revisited and the published results are shown to be misleading when re-analysed using our new approach. CONCLUSION: Biased statistics such as MMRE are deprecated. By contrast, our approach leads to valid results. Such steps will greatly assist in performing future meta-analyses. | en_US |
dc.language.iso | en | en_US |
dc.subject | Software engineering | en_US |
dc.subject | Software project management | en_US |
dc.subject | Forecasting | en_US |
dc.subject | Effort prediction | en_US |
dc.subject | Accuracy | en_US |
dc.subject | Empirical evaluation | en_US |
dc.title | New ideas and emerging research: evaluating prediction system accuracy | en_US |
dc.type | Research Paper | en_US |
Appears in Collections: | Computer Science; Dept of Computer Science Research Papers; Software Engineering (B-SERC) |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Fulltext.pdf | | 257.64 kB | Adobe PDF | View/Open |
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.
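The abstract contrasts the deprecated MMRE statistic with Standardised Accuracy (SA) measured against a random-guessing baseline. As a minimal sketch of how such a comparison might be computed, assuming the commonly used definition SA = (1 − MAR / MAR_guess) × 100, where MAR is the mean absolute residual and MAR_guess is the mean absolute residual obtained by predicting each case with the actual value of another randomly chosen case (variants exclude the case itself); the data below are invented for illustration and are not from the paper:

```python
import random

# Hypothetical actual and predicted effort values (e.g. person-hours),
# for illustration only -- not data from the paper.
actuals = [120.0, 80.0, 45.0, 200.0, 60.0]
predicted = [100.0, 95.0, 50.0, 150.0, 70.0]

def mmre(actual, pred):
    """Mean Magnitude of Relative Error: the biased statistic the
    abstract deprecates (it treats over- and under-estimates asymmetrically)."""
    return sum(abs(a - p) / a for a, p in zip(actual, pred)) / len(actual)

def mar(actual, pred):
    """Mean Absolute Residual."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def standardised_accuracy(actual, pred, runs=1000, seed=0):
    """Standardised Accuracy, assuming SA = (1 - MAR / MAR_guess) * 100.
    MAR_guess is estimated by Monte Carlo: predict each case with the
    actual value of a randomly sampled case, and average the resulting
    MAR over many runs. SA = 0 means no better than guessing."""
    rng = random.Random(seed)
    guess_mars = []
    for _ in range(runs):
        guesses = [rng.choice(actual) for _ in actual]
        guess_mars.append(mar(actual, guesses))
    mar_guess = sum(guess_mars) / runs
    return (1.0 - mar(actual, pred) / mar_guess) * 100.0
```

A prediction system that matches the actuals exactly scores SA = 100, while a system no better than random guessing scores near 0, which is what makes SA interpretable relative to the guessing baseline the abstract describes.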