Reading Dreams
How specific visual dream contents are represented by brain activity is unclear. Machine-learning–based analyses can decode the stimulus- and task-induced brain activity patterns that represent specific visual contents. Horikawa et al. (p. 639, published online 4 April) examined patterns of brain activity during dreaming and compared them with waking responses to visual stimuli. The findings suggest that the visual content of dreams is represented by the same neural substrate as that engaged during awake perception.
Abstract
Visual imagery during sleep has long been a topic of speculation, but its private nature has hampered objective analysis. Here we present a neural decoding approach in which machine-learning models predict the contents of visual imagery during the sleep-onset period from measured brain activity, by discovering links between human functional magnetic resonance imaging (fMRI) activity patterns and verbal reports with the assistance of lexical and image databases. Decoding models trained on stimulus-induced brain activity in visual cortical areas showed accurate classification, detection, and identification of contents. Our findings demonstrate that specific visual experience during sleep is represented by brain activity patterns shared with stimulus perception, providing a means to uncover the subjective contents of dreaming through objective neural measurement.
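To make the decoding approach concrete, the sketch below shows how a linear classifier trained on awake-perception activity might be applied to sleep-onset activity. It is illustrative only: the synthetic arrays, variable names, and the choice of scikit-learn's LinearSVC are assumptions for this example, not the authors' actual pipeline, which involved fMRI preprocessing and content labels derived from verbal reports via lexical (e.g., WordNet-style) and image databases.

```python
# Minimal sketch of perception-trained decoding applied to sleep-onset data.
# Hypothetical names and synthetic data throughout; real inputs would be
# preprocessed voxel patterns from visual cortical areas.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Placeholder stand-ins for voxel patterns: rows = trials, columns = voxels.
n_trials, n_voxels = 200, 1000
X_perception = rng.standard_normal((n_trials, n_voxels))

# Binary label per trial: whether a given content category (e.g., "car")
# was present. In the study, such labels come from verbal reports mapped
# onto a lexical database.
y_category = rng.integers(0, 2, size=n_trials)

# Train a linear decoder on stimulus-induced (awake perception) activity.
decoder = LinearSVC(C=1.0, max_iter=10_000)
decoder.fit(X_perception, y_category)

# Apply the perception-trained decoder to brain activity measured during
# the sleep-onset period, just before an awakening.
X_sleep_onset = rng.standard_normal((20, n_voxels))
predicted_present = decoder.predict(X_sleep_onset)
print(predicted_present)
```

In the study itself, the decoder's output for each pre-awakening period is scored against the verbal report collected at that awakening; above-chance agreement is what supports the claim that dream content and perception share neural representations.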