Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/2134
Full metadata record
DC Field | Value | Language
---|---|---
dc.contributor.author | Gobet, F | - |
dc.contributor.author | Lane, PCR | - |
dc.coverage.spatial | 1 | en |
dc.date.accessioned | 2008-05-01T13:17:29Z | - |
dc.date.available | 2008-05-01T13:17:29Z | - |
dc.date.issued | 2004 | - |
dc.identifier.citation | Proceedings of the 26th Annual Meeting of the Cognitive Science Society (p. 3). Mahwah, NJ: Erlbaum. | en |
dc.identifier.uri | http://bura.brunel.ac.uk/handle/2438/2134 | - |
dc.description.abstract | CHREST (Chunk Hierarchy and REtrieval STructures) is a comprehensive computational model of human learning and perception. It has been used to successfully simulate data in a variety of domains, including the acquisition of syntactic categories, expert behaviour, concept formation, implicit learning, and the acquisition of multiple representations in physics for problem solving. The aim of this tutorial is to provide participants with an introduction to CHREST, to how it can be used to model various phenomena, and to the knowledge needed to carry out their own modelling experiments. | en
dc.format.extent | 42733 bytes | - |
dc.format.mimetype | application/pdf | - |
dc.language.iso | en | - |
dc.publisher | Erlbaum | en |
dc.subject | CHREST | en |
dc.subject | Computer modelling | en |
dc.subject | Cognitive architecture | en |
dc.subject | Learning | en |
dc.subject | Syntax | en |
dc.subject | Expertise | en |
dc.subject | Retrieval structure | en |
dc.subject | Concept | en |
dc.subject | Implicit learning | en |
dc.subject | Physics | en |
dc.title | CHREST tutorial: Simulations of human learning | en |
dc.type | Research Paper | en |
Appears in Collections: | Psychology Dept of Life Sciences Research Papers |
Files in This Item:
File | Description | Size | Format
---|---|---|---
CogSci-2004-gobet-lane-CHREST-tutorial.pdf | | 41.73 kB | Adobe PDF
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.