TY - JOUR
T1 - Learning the generative principles of a symbol system from limited examples
AU - Yuan, Lei
AU - Xiang, Violet
AU - Crandall, David
AU - Smith, Linda
PY - 2020/7/1
Y1 - 2020/7/1
N2 - The processes and mechanisms of human learning are central to inquiries in a number of fields including psychology, cognitive science, development, education, and artificial intelligence. Arguments, debates, and controversies linger over the questions of human learning with one of the most contentious being whether simple associative processes could explain human children's prodigious learning, and in doing so, could lead to artificial intelligence that parallels human learning. One phenomenon at the center of these debates concerns a form of far generalization, sometimes referred to as “generative learning”, because the learner's behavior seems to reflect more than co-occurrences among specifically experienced instances and to be based on principles through which new instances may be generated. In two experimental studies (N = 148) of preschool children's learning of how multi-digit number names map to their written forms and in a computational modeling experiment using a deep learning neural network, we show that data sets with a suite of inter-correlated imperfect predictive components yield far and systematic generalizations that accord with generative principles and do so despite limited examples and exceptions in the training data. Implications for human cognition, cognitive development, education, and machine learning are discussed.
KW - Associative learning
KW - Deep learning
KW - Education
KW - Generative learning
KW - Statistical learning
KW - Symbol systems
UR - http://www.scopus.com/inward/record.url?scp=85081010913&partnerID=8YFLogxK
U2 - 10.1016/j.cognition.2020.104243
DO - 10.1016/j.cognition.2020.104243
M3 - Article
AN - SCOPUS:85081010913
VL - 200
JO - Cognition
JF - Cognition
SN - 0010-0277
M1 - 104243
ER -