Abstract
Understanding the nature of meaning and its extensions (with metaphor as one typical kind) has been a core issue in figurative language study since Aristotle's time. This research takes a computational cognitive perspective to model metaphor, based on the assumption that meaning is perceptual, embodied, and encyclopedic. We model word meaning representation for metaphor detection with embodiment information obtained from behavioral experiments. Our work is the first attempt to incorporate sensorimotor knowledge into neural networks for metaphor detection, and it demonstrates superior performance, consistency, and interpretability compared with peer systems on two general-purpose datasets. In addition, a cross-sectional analysis of different feature schemas suggests that metaphor, as a device of cognitive conceptualization, can be 'learned' from perceptual and actional information independently of several more explicit levels of linguistic representation. Access to such knowledge allows us to probe further into word-meaning mapping tendencies relevant to how we conceptualize and react to the physical world.
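The abstract does not spell out the model architecture, but the general idea of fusing behavioral embodiment ratings with neural word representations can be illustrated roughly as below: per-word sensorimotor rating vectors are concatenated with contextual token embeddings before a small classification head predicts metaphoricity. This is a minimal sketch under assumed settings; the class name `EmbodiedMetaphorClassifier`, the dimension constant `SENSORIMOTOR_DIMS`, and all toy values are hypothetical and not taken from the paper.

```python
# Illustrative sketch (not the authors' implementation) of fusing per-word
# sensorimotor ratings from behavioral norms with contextual embeddings
# for token-level metaphor detection.

import torch
import torch.nn as nn

# Hypothetical number of perceptual + action rating dimensions per word.
SENSORIMOTOR_DIMS = 11


class EmbodiedMetaphorClassifier(nn.Module):
    """Concatenates a token's contextual embedding with its sensorimotor
    rating vector and predicts a metaphoricity score for that token."""

    def __init__(self, embed_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(embed_dim + SENSORIMOTOR_DIMS, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, token_embeddings, sensorimotor_ratings):
        # token_embeddings:     (batch, embed_dim), from any sentence encoder
        # sensorimotor_ratings: (batch, SENSORIMOTOR_DIMS), looked up per word
        fused = torch.cat([token_embeddings, sensorimotor_ratings], dim=-1)
        return torch.sigmoid(self.scorer(fused)).squeeze(-1)


if __name__ == "__main__":
    model = EmbodiedMetaphorClassifier(embed_dim=768)
    emb = torch.randn(4, 768)                  # stand-in contextual embeddings
    norms = torch.rand(4, SENSORIMOTOR_DIMS)   # stand-in rating vectors
    print(model(emb, norms))                   # per-token metaphoricity scores
```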
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-29 |
| Number of pages | 29 |
| Journal | Natural Language Engineering |
| DOIs | |
| Publication status | E-pub ahead of print - 20 Sept 2023 |
Scopus Subject Areas
- Software
- Language and Linguistics
- Linguistics and Language
- Artificial Intelligence
User-Defined Keywords
- Deep learning
- Embodiment
- Knowledge incorporation
- Metaphor detection
- Sense modality