Emergence of analogy from relation learning.
- Author(s): Lu H; Wu YN; Holyoak KJ
- Source:
Proceedings of the National Academy of Sciences of the United States of America [Proc Natl Acad Sci U S A] 2019 Mar 05; Vol. 116 (10), pp. 4176-4181. Date of Electronic Publication: 2019 Feb 15.
- Publication Type:
Journal Article; Research Support, Non-U.S. Gov't; Research Support, U.S. Gov't, Non-P.H.S.
- Language:
English
- Additional Information
- Source:
Publisher: National Academy of Sciences Country of Publication: United States NLM ID: 7505876 Publication Model: Print-Electronic Cited Medium: Internet ISSN: 1091-6490 (Electronic) Linking ISSN: 0027-8424 NLM ISO Abbreviation: Proc Natl Acad Sci U S A Subsets: PubMed not MEDLINE; MEDLINE
- Publication Information:
Original Publication: Washington, DC : National Academy of Sciences
- Abstract:
By middle childhood, humans are able to learn abstract semantic relations (e.g., antonym, synonym, category membership) and use them to reason by analogy. A deep theoretical challenge is to show how such abstract relations can arise from nonrelational inputs, thereby providing key elements of a protosymbolic representation system. We have developed a computational model that exploits the potential synergy between deep learning from "big data" (to create semantic features for individual words) and supervised learning from "small data" (to create representations of semantic relations between words). Given as inputs labeled pairs of lexical representations extracted by deep learning, the model creates augmented representations by remapping features according to the rank of differences between values for the two words in each pair. These augmented representations aid in coping with the feature alignment problem (e.g., matching those features that make "love-hate" an antonym with the different features that make "rich-poor" an antonym). The model extracts weight distributions that are used to estimate the probabilities that new word pairs instantiate each relation, capturing the pattern of human typicality judgments for a broad range of abstract semantic relations. A measure of relational similarity can be derived and used to solve simple verbal analogies with human-level accuracy. Because each acquired relation has a modular representation, basic symbolic operations are enabled (notably, the converse of any learned relation can be formed without additional training). Abstract semantic relations can be induced by bootstrapping from nonrelational inputs, thereby enabling relational generalization and analogical reasoning.
Competing Interests: The authors declare no conflict of interest.
(Copyright © 2019 the Author(s). Published by PNAS.)
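The abstract's pipeline can be pictured as: obtain word vectors from deep learning on big data, augment each word pair by remapping its difference vector according to the rank of the differences, learn per-relation weights from a small set of labeled pairs, then score new pairs against each relation and compare relation profiles to solve analogies. The Python sketch below is a minimal, hypothetical illustration of that data flow only; the toy embeddings, logistic scoring, random weights, and all function names are invented here and are not the authors' model or code.

```python
import numpy as np

# Toy stand-ins for word embeddings; in the paper these are semantic
# feature vectors produced by deep learning on large text corpora.
rng = np.random.default_rng(0)
VOCAB = ["love", "hate", "rich", "poor", "hot", "cold"]
DIM = 50
EMB = {w: rng.normal(size=DIM) for w in VOCAB}

def augmented_pair_features(a, b):
    """Pair representation: the two word vectors plus a copy of their
    difference vector reordered by the rank of the differences
    (one plausible reading of the abstract's rank-remapping step)."""
    va, vb = EMB[a], EMB[b]
    diff = va - vb
    ranked_diff = diff[np.argsort(diff)]  # features remapped by rank of difference
    return np.concatenate([va, vb, ranked_diff])

def relation_probability(pair_features, weights):
    """Probability that a pair instantiates one learned relation; a plain
    logistic score stands in for the paper's learned weight distributions."""
    return 1.0 / (1.0 + np.exp(-(weights @ pair_features)))

def converse_probability(a, b, weights):
    """Converse of a learned relation: evaluate the same weights on the
    pair in reversed order, with no additional training."""
    return relation_probability(augmented_pair_features(b, a), weights)

def relational_similarity(pair1, pair2, relation_weights):
    """Cosine similarity between the relation-probability profiles of two
    pairs; a high value supports the analogy pair1 :: pair2."""
    p1 = np.array([relation_probability(augmented_pair_features(*pair1), w)
                   for w in relation_weights])
    p2 = np.array([relation_probability(augmented_pair_features(*pair2), w)
                   for w in relation_weights])
    return float(p1 @ p2 / (np.linalg.norm(p1) * np.linalg.norm(p2) + 1e-12))

# Toy "learned" weights for two relations (e.g. antonym, synonym); in the
# model these would be estimated from a small set of labeled word pairs.
relation_weights = [rng.normal(size=3 * DIM) for _ in range(2)]

# Verbal analogy love : hate :: hot : ?  Choose the candidate whose pair
# with "hot" is most relationally similar to ("love", "hate").  With random
# toy weights the choice is arbitrary; the point is only the data flow.
candidates = ["cold", "rich", "poor"]
best = max(candidates,
           key=lambda d: relational_similarity(("hot", d), ("love", "hate"),
                                               relation_weights))
print("love : hate :: hot :", best)
```

Because each relation keeps its own weight vector, the sketch's converse_probability shows how a converse relation follows from simply reversing the pair order, mirroring the modularity point made in the abstract.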
- Contributed Indexing:
Keywords: analogy; generalization; learning; semantic relations; word embeddings
- Publication Date:
Date Created: 20190217 Date Completed: 20200312 Latest Revision: 20200312
- PMCID:
PMC6410800
- DOI:
10.1073/pnas.1814779116
- PMID:
30770443