Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/24204
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Byerly, A | - |
dc.contributor.author | Kalganova, T | - |
dc.date.accessioned | 2022-03-05T11:05:36Z | - |
dc.date.available | 2022-03-05T11:05:36Z | - |
dc.date.issued | 2023-01-07 | - |
dc.identifier | arXiv:2202.03238v1 | - |
dc.identifier | ORCID iDs: Adam Byerly https://orcid.org/0000-0002-9124-5008; Tatiana Kalganova https://orcid.org/0000-0003-4859-7152. | - |
dc.identifier | 144 | - |
dc.identifier.citation | Byerly, A. and Kalganova, T. (2023) 'Towards an Analytical Definition of Sufficient Data', SN Computer Science, 4 (2), 144, pp. 1 - 23. doi: 10.1007/s42979-022-01549-4. | en_US |
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/24204 | - |
dc.description | The article available in this repository is an uncorrected preprint, available at https://doi.org/10.48550/arXiv.2202.03238. It has not been peer reviewed. | - |
dc.description.abstract | Copyright © 2022 The Author(s). We show that, for each of five datasets of increasing complexity, certain training samples are more informative of class membership than others. These samples can be identified a priori to training by analyzing their position in reduced dimensional space relative to the classes' centroids. Specifically, we demonstrate that samples nearer a class's centroid are less informative than those that are farthest from it. For all five datasets, we show that there is no statistically significant difference between training on the entire training set and training when excluding up to 2% of the data nearest to each class's centroid. | en_US |
dc.format.extent | 1 - 23 | - |
dc.format.medium | Electronic | - |
dc.language.iso | en_US | en_US |
dc.publisher | Springer Nature | en_US |
dc.rights | arXiv perpetual, non-exclusive license (https://arxiv.org/licenses/nonexclusive-distrib/1.0/): This license gives limited rights to arXiv to distribute the article, and also limits re-use of any type from other entities or individuals. Some authors require a license that is not listed here. As long as the desired license does not restrict arXiv’s license, authors can select the arXiv license and then indicate the desired license in the first page of the article. This is a requirement of some funders and governments. Metadata license: A Creative Commons CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/) will apply to all metadata. | - |
dc.rights.uri | https://arxiv.org/licenses/nonexclusive-distrib/1.0/ | - |
dc.subject | data reduction | en_US |
dc.subject | dimensional reduction | en_US |
dc.subject | UMAP | - |
dc.subject | class separation | - |
dc.subject | dataset severability | - |
dc.title | Towards an Analytical Definition of Sufficient Data | en_US |
dc.type | Article | en_US |
dc.identifier.doi | https://doi.org/10.1007/s42979-022-01549-4 | - |
pubs.issue | 2 | - |
pubs.volume | 4 | - |
dc.identifier.eissn | 2331-8422 | - |
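The selection criterion described in the abstract — rank each class's samples by distance to that class's centroid in a reduced-dimensional embedding, then exclude the nearest few percent — can be sketched as follows. This is an illustrative sketch only: the paper applies the idea to UMAP embeddings of real datasets, whereas here a toy 2-D Gaussian mixture stands in for the embedding, and all function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a reduced-dimensional (e.g. UMAP) embedding:
# two well-separated Gaussian classes of 500 points each in 2-D.
X = np.vstack([rng.normal(0.0, 1.0, (500, 2)),
               rng.normal(5.0, 1.0, (500, 2))])
y = np.repeat([0, 1], 500)

def nearest_to_centroid(X, y, frac=0.02):
    """Return indices of the `frac` fraction of samples closest to
    their own class's centroid (the least informative samples,
    per the abstract's claim)."""
    drop = []
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        centroid = X[idx].mean(axis=0)
        dist = np.linalg.norm(X[idx] - centroid, axis=1)
        k = max(1, int(frac * idx.size))          # samples to exclude
        drop.extend(idx[np.argsort(dist)[:k]])    # k nearest to centroid
    return np.array(sorted(drop))

drop = nearest_to_centroid(X, y, frac=0.02)
keep = np.setdiff1d(np.arange(len(y)), drop)
print(len(drop), len(keep))  # 20 samples excluded, 980 retained
```

Under the paper's finding, training on `keep` alone would be expected to perform on par with training on the full set; verifying that, of course, requires the actual training runs reported in the article.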
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Preprint.pdf | arXiv perpetual, non-exclusive license (https://arxiv.org/licenses/nonexclusive-distrib/1.0/) | 2.91 MB | Adobe PDF | View/Open |
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.