Are you taking your time when feeding your pet? Fluffy and Fido are on to you — and they can tell when you are dawdling.
A new study from Northwestern University has found some of the clearest evidence yet that animals can judge time. By examining the brain’s medial entorhinal cortex, the researchers discovered a previously unknown set of neurons that turn on like a clock when an animal is waiting.
“Does your dog know that it took you twice as long to get its food as it took yesterday? There wasn’t a good answer for that before,” said Daniel Dombeck, who led the study. “This is one of the most convincing experiments to show that animals really do have an explicit representation of time in their brains when they are challenged to measure a time interval.”
The research was published online this week in the journal Nature Neuroscience. Dombeck is an associate professor of neurobiology in Northwestern’s Weinberg College of Arts and Sciences.
When planning the study, Dombeck’s team focused on the medial entorhinal cortex, an area located in the brain’s temporal lobe that is associated with memory and navigation. Because that part of the brain encodes spatial information in episodic memories, Dombeck hypothesized that the area could also be responsible for encoding time.
“Every memory is a bit different,” said James Heys, a postdoctoral fellow in Dombeck’s laboratory. “But there are two central features to all episodic memories: space and time. They always happen in a particular environment and are always structured in time.”
To test their hypothesis, Dombeck and Heys set up an experiment called the virtual “door stop” task. In the experiment, a mouse runs on a physical treadmill in a virtual reality environment. The mouse learns to run down a hallway to a door that is located about halfway down the track. After six seconds, the door opens, allowing the mouse to continue down the hallway to receive its reward.
After several training sessions, the researchers made the door invisible in the virtual reality scene. In the new scenario, the mouse still knew where the now-invisible “door” was located based on the floor’s changing textures. And it still waited six seconds at the “door” before abruptly racing down the track to collect its reward.
“The important point here is that the mouse doesn’t know when the door is open or closed because it’s invisible,” said Heys, the paper’s first author. “The only way he can solve this task efficiently is by using his brain’s internal sense of time.”
By using virtual reality, Dombeck and his team can neatly control potential confounding factors, such as the sound of the door opening. “We wouldn’t be able to make the door completely invisible in a real environment,” Dombeck said. “The animal could touch it, hear it, smell it or sense it in some way. They wouldn’t have to judge time; they would just sense when the door opened. In virtual reality, we can take away all sensory cues.”
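The logic of the task can be summarized in a short sketch. The snippet below is purely an illustrative simulation, not the authors’ experimental code: the normalized track length, step size, and function names are assumptions, with only the roughly halfway door position and the six-second delay taken from the description above.

```python
import time

# A minimal sketch of the virtual "door stop" task logic, assuming a
# normalized track and a fixed six-second delay. Names and parameters
# are illustrative; this is not the authors' experimental code.

TRACK_LENGTH = 1.0      # normalized length of the virtual hallway (assumption)
DOOR_POSITION = 0.5     # invisible door roughly halfway down the track
DOOR_DELAY_S = 6.0      # time the animal must wait before the door "opens"

def door_is_open(arrival_time: float) -> bool:
    """The door opens only once six seconds have elapsed since arrival."""
    return time.monotonic() - arrival_time >= DOOR_DELAY_S

def run_trial(step: float = 0.05) -> str:
    position = 0.0
    # Run phase: spatial cues (the floor texture) tell the animal where the door is.
    while position < DOOR_POSITION:
        position += step
    # Wait phase: no sensory cue marks the opening, so only an internal
    # estimate of elapsed time tells the animal when it is safe to move on.
    arrival = time.monotonic()
    while not door_is_open(arrival):
        time.sleep(0.1)
    # Reward phase: finish the track and collect the reward.
    while position < TRACK_LENGTH:
        position += step
    return "reward delivered"

if __name__ == "__main__":
    print(run_trial())
```

The key design point the sketch tries to capture is that nothing in the wait phase signals the opening of the door; the delay is enforced purely by the clock.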
But Dombeck and his team did more than watch the mice complete the door stop task over and over again. They took the experiment one step further by imaging the mice’s brain activity. Using two-photon microscopy, which allows advanced, high-resolution imaging of the brain, Dombeck and Heys watched the mice’s neurons fire.
“As the animals run along the track and get to the invisible door, we see the cells firing that control spatial encoding,” Dombeck said. “Then, when the animal stops at the door, we see those cells turned off and a new set of cells turn on. This was a big surprise and a new discovery.”
Dombeck noted these “timing cells” did not fire during active running — only during rest. “Not only are the cells active during rest,” he said, “but they actually encode how much time the animal has been resting.”
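One way to picture what “encoding elapsed time” could mean is a toy population model in which each cell fires most strongly at a different preferred delay, so the identity of the most active cell reports how long the animal has been resting. The sketch below is a conceptual illustration only, not the study’s analysis; the number of cells, the Gaussian tuning, and the decoding rule are all assumptions.

```python
import numpy as np

# Toy model (an assumption, not the study's analysis) of how a population
# of "timing cells" could encode elapsed rest time: each cell's firing rate
# peaks at a different preferred delay.

N_CELLS = 20
MAX_DELAY_S = 6.0
preferred_delays = np.linspace(0.0, MAX_DELAY_S, N_CELLS)   # one peak per cell
TUNING_WIDTH_S = 0.5                                        # width of each cell's timing field

def population_activity(elapsed_s: float) -> np.ndarray:
    """Gaussian tuning: a cell fires most when elapsed time matches its preferred delay."""
    return np.exp(-0.5 * ((elapsed_s - preferred_delays) / TUNING_WIDTH_S) ** 2)

def decode_elapsed_time(activity: np.ndarray) -> float:
    """Read out elapsed time as the preferred delay of the most active cell."""
    return float(preferred_delays[np.argmax(activity)])

if __name__ == "__main__":
    for t in (1.0, 3.0, 5.5):
        rates = population_activity(t)
        print(f"true elapsed {t:.1f}s -> decoded {decode_elapsed_time(rates):.2f}s")
```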
The implications of the work extend well beyond your impatient pooch. Now that researchers have found these new time-encoding neurons, they can study how neurodegenerative diseases might affect this set of cells.
“Patients with Alzheimer’s disease notably forget when things happened in time,” Heys said. “Perhaps this is because they are losing some of the basic functions of the entorhinal cortex, which is one of the first brain regions affected by the disease.”
“So this could lead to new early-detection tests for Alzheimer’s,” Dombeck added. “We could start asking people to judge how much time has elapsed or ask them to navigate a virtual reality environment — essentially having a human do a ‘door stop’ task.”
This work was supported by the National Institutes of Health (award number 1R01MH101297), the McKnight Foundation and the Chicago Biomedical Consortium.