Original Articles

How we learn about things we don't already understand

Pages 343-369
Received 01 Feb 2005
Accepted 01 May 2005
Published online: 20 Feb 2007

The computation-as-cognition metaphor requires that all cognitive objects are constructed from a fixed set of basic primitives; prominent models of cognition and perception try to provide that fixed set. Despite this effort, however, there are no extant computational models that can actually generate complex concepts and processes from simple and generic basic sets, and there are good reasons to doubt that such models will be forthcoming. We suggest that one can have the benefits of computationalism without a commitment to fixed feature sets, by postulating processes that slowly develop special-purpose feature languages, from which knowledge is constructed. This provides an alternative to the fixed-model conception without radical anti-representationalism. Substantial evidence suggests that such feature development and adaptation actually occurs in the perceptual learning that accompanies category learning. Given the existence of robust methods for novel feature creation, the assumption that a fixed basis set of primitives is psychologically necessary is at best premature. Methods of primitive construction include: (a) perceptual sensitization to physical stimuli; (b) unitization and differentiation of existing (non-psychological) stimulus elements into novel psychological primitives, guided by the current set of features; and (c) the intelligent selection of novel inputs, which in turn guides the automatic construction of new primitive concepts. Modelling the grounding of concepts as sensitivity to physical properties reframes the question of concept construction from the generation of an appropriate composition of sensations to the tuning of detectors to appropriate circumstances.
