Imagine that you live in the rainforests of Southeast Asia, you’re a pint-sized primate with enormous eyes roughly the same size as your brain, and you look a little like Gizmo from the movie “Gremlins.” You’re a tarsier, a nocturnal animal whose giant eyes give you exceptional visual sensitivity and, with it, a predatory advantage. Tarsier Goggles, new virtual reality software developed at Dartmouth College, simulates a tarsier’s vision and illustrates the adaptive advantage of this animal’s oversized eyes. Both the virtual reality build and the team’s findings, published recently in Evolution: Education and Outreach, are available for free online.
Tarsier Goggles was developed by Samuel Gochman ’18 while he was a student at Dartmouth, and by Nathaniel J. Dominy, the Charles Hansen Professor of Anthropology at Dartmouth, who studies the evolution of primate sensory systems, in collaboration with the Dartmouth Applied Learning and Innovation (DALI) Lab, where students design and build technology.
Gochman approached the DALI Lab with a problem: how could people’s perception of the world be changed by letting them experience the tarsier’s unique ocular adaptations? Through an iterative process, the DALI team explored different design solutions, and Gochman and the team determined that a virtual reality experience would work best: it is not only immersive but can also be used as a teaching tool in a classroom setting.
The open-access software, Tarsier Goggles, features three virtual learning environments, “Matrix,” “Labyrinth” and “Bornean Rainforest,” which simulate how a tarsier’s vision differs from a human’s in acuity, color vision and brightness. Bornean tarsiers have protanopia, a form of red-green colorblindness. In the virtual Bornean Rainforest, users can move through the forest, leaping and clinging to trees in “a dark, maze-like space that is practically opaque under human visual conditions but navigable as a tarsier, demonstrating the advantages of tarsier visual sensitivity,” as the authors describe it.
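The paper and press materials do not publish the underlying color math, but as a rough sketch of what simulating protanopia involves: each rendered color can be multiplied by a 3x3 matrix that collapses the red-green axis. The C# helper below is an illustrative assumption rather than the project’s actual code; the coefficients follow a widely circulated published approximation of full protanopia (Machado et al.-style) and nominally apply to linear RGB values.

```csharp
using UnityEngine;

// Illustrative sketch only (not from the Tarsier Goggles source): simulate
// protanopia by multiplying each color by a 3x3 matrix that removes the
// red-green distinction. Coefficients are a commonly used published
// approximation for full-severity protanopia; assumed, not taken from the paper.
public static class ProtanopiaSimulator
{
    // Rows of the protanopia simulation matrix (linear RGB in, linear RGB out).
    static readonly Vector3 rowR = new Vector3( 0.152286f, 1.052583f, -0.204868f);
    static readonly Vector3 rowG = new Vector3( 0.114503f, 0.786281f,  0.099216f);
    static readonly Vector3 rowB = new Vector3(-0.003882f, -0.048116f, 1.051998f);

    public static Color Simulate(Color c)
    {
        var rgb = new Vector3(c.r, c.g, c.b);
        return new Color(
            Mathf.Clamp01(Vector3.Dot(rowR, rgb)),
            Mathf.Clamp01(Vector3.Dot(rowG, rgb)),
            Mathf.Clamp01(Vector3.Dot(rowB, rgb)),
            c.a);
    }
}
```

In practice an effect like this would run in a shader over every pixel of the frame rather than per-color in C#, but the transform is the same idea.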
“Most ninth- and 10th-grade students in the U.S. learn about optics and natural selection, but the two topics are usually treated in isolation,” says Dominy, who served as one of the co-authors. “The tarsier is an effective means of unifying both concepts. You have to understand optical principles to understand why natural selection would favor such enormous eyes in such a tiny predator.”
At Dartmouth, Gochman focused on biological anthropology and human-centered design, and this project was one of the ways he applied these research interests. “I realized that most students’ learning of natural selection was limited to diagrams, slideshows and models,” says Gochman, who served as the lead author of the study. “Virtual reality offers an immersive experience for understanding some of the properties of the tarsier’s vision, as a result of its adaptations. Tarsier Goggles is a science education tool that engages students in hands-on scientific concepts in physics, perceptual science and biology,” he adds.
As part of the study, Gochman demonstrated Tarsier Goggles at two on-campus events at Dartmouth, at an anthropological society meeting, and to a class of sixth-graders visiting the Vermont Institute of Natural Science in Quechee, Vt. He also demonstrated the technology to high school students at Kimball Union Academy in Meriden, N.H., where students in science and anthropology classes watched a brief video on tarsiers’ foraging behavior and then each had five minutes to try out the virtual reality technology. The students then completed a brief post-survey with open-ended questions as part of Gochman’s formal assessment of the virtual reality tool.
“The Tarsier Goggles project engaged my students first-hand in a learning experience, which could not have been achieved through any other medium,” explains Marilyn Morano Lord ’95, MALS ’97, an anthropology and world history teacher at Kimball Union Academy, who also served as one of the co-authors of the paper.
Tarsier Goggles was built in Unity3D with SteamVR for the HTC Vive Pro and was coded in C#. The Virtual Reality Toolkit was used to create functionalities such as teleportation. Many of the visual effects rely on Unity’s built-in post-processing stack, and the assets were built in Maya. All of the visual assets and code were created from scratch by the DALI team, following the lab’s collaborative, human-centered design approach.
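The team’s code is not reproduced in the paper, so purely as an illustration of the general pattern: in Unity’s built-in render pipeline, a full-screen filter of the kind described above can be attached to the camera with a small component that blits each frame through a material. The class and field names below are hypothetical, and the actual project used Unity’s post-processing stack rather than this simpler hook.

```csharp
using UnityEngine;

// Minimal sketch, not the DALI team's code: attach to the VR camera to run a
// full-screen filter (e.g. a tarsier color/brightness effect) over each frame
// in Unity's built-in render pipeline. The material's shader would implement
// the actual color transform; it is assigned in the Inspector.
[RequireComponent(typeof(Camera))]
public class TarsierVisionEffect : MonoBehaviour
{
    [SerializeField] Material effectMaterial;   // material using the filter shader

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (effectMaterial != null)
            Graphics.Blit(source, destination, effectMaterial); // apply the filter shader
        else
            Graphics.Blit(source, destination);                 // pass the frame through unchanged
    }
}
```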
Tarsier Goggles illustrates how virtual reality can be applied to science education, giving students a fun, interactive way to explore complex concepts.
Story Source:
Materials provided by Dartmouth College.