Summary
For an animal to successfully feed, mate, and avoid danger, its brain must integrate incoming information from many sensory modalities, combine that information with previously stored knowledge about the world, and then send appropriate output commands to the muscles. The information in this process is highly spatial in nature, but it is not anchored to any one coordinate reference frame. For example, sensory data from a fingertip tell the animal about a point in space, but exactly which point depends on the position of the finger relative to the wrist and arm it is attached to, as well as on the animal's actual location in the world. Similarly, for information arriving on the retina, the corresponding point in space depends on depth of field, the position of the eye within the socket, the position of the head relative to the body, and the location of the animal in the world. To integrate this highly spatial information, the brain needs to transform between coordinate systems. On page 584 of this issue, Mimica et al. (1) demonstrate how the posterior parietal cortex (PPC) in the brains of rats represents different aspects of the animal's posture, that is, the relative positions of different parts of its body, implying that these posture cells could be one of the building blocks for coordinate transformation in the brain.
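To make the idea of transforming between coordinate frames concrete, the sketch below is one minimal way to express it computationally; it is not taken from Mimica et al., and the frame names, angles, and offsets are invented placeholder values. It chains simple 2D rigid-body (homogeneous) transforms so that a point sensed in an eye-centered frame is re-expressed in head-, body-, and finally world-centered coordinates.

```python
import numpy as np

def rigid_transform(angle_rad, translation):
    """Homogeneous 2D transform: rotate by angle_rad, then translate by (tx, ty)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([
        [c, -s, translation[0]],
        [s,  c, translation[1]],
        [0.0, 0.0, 1.0],
    ])

# Hypothetical poses (angles in degrees, offsets in cm), chosen only for illustration:
eye_in_head   = rigid_transform(np.deg2rad(10), (1.0, 0.5))    # eye rotated within the socket
head_on_body  = rigid_transform(np.deg2rad(-25), (0.0, 4.0))   # head turned relative to the trunk
body_in_world = rigid_transform(np.deg2rad(90), (30.0, 12.0))  # animal's location and heading

# A visual target 8 cm straight ahead in eye-centered coordinates (homogeneous form):
point_eye = np.array([8.0, 0.0, 1.0])

# Composing the transforms maps the same physical point into world coordinates:
point_world = body_in_world @ head_on_body @ eye_in_head @ point_eye
print(point_world[:2])
```

The same point yields different numbers in each frame; only by knowing the intermediate postural variables (eye-in-head, head-on-body, body-in-world) can the chain of transforms be composed, which is why a neural representation of posture is a plausible ingredient for such computations.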
This is an article distributed under the terms of the Science Journals Default License.