The Outer Spaces of Learning
October 14, 2009
This post was originally published on the Breakthrough Learning in a Digital Age blog.
In the recent Star Trek movie by J.J. Abrams we get to see what a 23rd-century Vulcan school is like for young Spock. Hundreds of crater-like pods, each with a 360-degree projection system, allow individual students to navigate an encyclopedic array of topics fully described through visualizations and sound. These images, as science fiction images often do, illuminate more about our present anxieties and desires than they do about possible futures. Star Trek fan and media theorist Henry Jenkins points out these conflicting visions on his blog, “Confessions of an Aca/Fan”:
“In some ways, this is the future which many educators fear — one where they have been displaced by the machine. In other ways, it is the future we hope for—one where there are no limits placed on the potentials of individual learners to advance.”
As an interaction designer I can’t help but be fascinated by the design characteristics of these learning pods, particularly as a learning space and experience. Spatially, the pods give the student a 360-degree view of virtual content, placing them at the center of a whirl of information visualized in formulas and images. Experientially, the intelligence (presumably artificial) that runs the displays also has a voice, quizzing the student on a wide array of subjects, from geometry to ethics. The effect is that of a high-tech quiz or spelling bee: knowledge containable in a question and a short answer. And in a reversal of the exploratory pleasures of the Internet, students merely provide answers to questions asked by the computer; they don’t ask questions of their own. Perhaps we are glimpsing a Vulcan standardized test? While the delivery is individualized, as Jenkins points out, it is also asocial and physically inactive (there’s no “learning by doing”), and, if we take away the special effects, no more advanced than some of the drill-based learning methods we use today.
From the tricorder to the holodeck, Star Trek has, since its debut in the 1960s, had a tremendous influence on technology designers. Wah Ming Chang’s communicator designs for the original TV series inspired Martin Cooper’s mobile phone design research. Will the images of the 23rd-century Vulcan school influence future school design? In terms of a spatial and experiential approach to learning, I actually think there are more interesting examples present now, in the 21st century. I’ll mention two of them, and explain why, from the perspective of interaction design, they promote not only cool new technologies and “special effects” but also breakthroughs in the design of learning.
In SMALLab (Situated Multimedia Art Learning Lab), a mixed-reality learning environment developed by ASU’s David Birchfield, students also inhabit a projected world where subjects can switch as quickly as the speed of light. This, however, is where the similarity between SMALLab and Star Trek’s vision ends. Instead of surrounding us at eye level, SMALLab’s projections are on the floor. The fact that the images are beneath you immediately activates the space. Like the lines chalked on a sidewalk to form a hopscotch court, SMALLab beckons for ACTION. We want to step, run, jump, and crawl on the interactive images beneath us. The space encourages this, using playful interaction to illuminate how systems work, from concrete poetry to physics. By using our bodies, SMALLab engages our minds in making sense of these systems. It is also a social space: SMALLab’s multi-user interface promotes collaboration and team problem solving. Students learn by coordinating their actions, helping each other, and making sense of how the systems underlying a subject actually behave and respond.
Like SMALLab, Mannahatta: The Game (M:TG), a prototype developed by PETLab at Parsons, takes students out of the typical learning environment – in this case, onto the streets of New York City. A collaboration with Dr. Eric Sanderson’s Mannahatta Project at the Wildlife Conservation Society, M:TG uses geographically tagged data about the ecosystems present on Manhattan island prior to colonization in 1609 to create a game of exploration and discovery. Currently available for the iPhone platform, the game requires that players traverse the city in order to uncover and link ecosystem elements, rebuilding the webs that once existed, city block by city block. The iPhone acts as a kind of detector, notifying players when a needed species (mountain lion, poplar tree) or eco-link (“food for”, “shelter for”) is available nearby. M:TG is a game, but it is also a platform for experimenting with how augmented reality and geo-spatial databases can be brought together to form a situated space for learning. In other words, M:TG takes learning out of the classroom and into the world.
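The post doesn’t describe M:TG’s internals, but the core “detector” mechanic – comparing a player’s location against geo-tagged ecosystem records and notifying them when something is in range – can be sketched in a few lines. The following Python is a minimal illustration under assumed details: the record format, coordinates, field names, and detection radius are hypothetical, not the game’s actual data or code.

```python
import math

# Hypothetical geo-tagged ecosystem records; the actual M:TG dataset,
# field names, and detection radius are not described in the post.
SPECIES_SITES = [
    {"name": "mountain lion", "lat": 40.7128, "lon": -74.0060},
    {"name": "poplar tree",   "lat": 40.7306, "lon": -73.9866},
]

DETECTION_RADIUS_M = 200  # assumed "detector" range, in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_species(player_lat, player_lon):
    """Return the records within the detection radius of the player."""
    return [
        site for site in SPECIES_SITES
        if haversine_m(player_lat, player_lon, site["lat"], site["lon"]) <= DETECTION_RADIUS_M
    ]

# Example: a player standing near City Hall Park would be "notified"
# of any species sites within the assumed radius.
for site in nearby_species(40.7127, -74.0059):
    print(f"A {site['name']} is available nearby!")
```

In a real mobile prototype the player’s coordinates would come from the device’s GPS and the records from a geo-spatial database, but the proximity check itself is what turns a tagged dataset into the situated, block-by-block discovery the game aims for.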
These two examples are pilot environments and prototypes. They aren’t yet common, and the implications of these spaces are still being discovered. Both share play, collaboration, and creativity as strategies for activating learning. These environments clearly diverge from current classroom norms – and they go a step beyond their science fiction counterparts – by making playful, rich spaces and experiences integral to 21st-century learning.
SMALLab is one of the two dozen innovations that will be showcased at the Technology Playground demo space at the Breakthrough Learning in a Digital Age Forum on October 28.
Referenced:
Henry Jenkins, “Five Ways to Start a Conversation About the New Star Trek Film,” blog post, May 12, 2009 (accessed September 28, 2009). http://henryjenkins.org/2009/05/five_ways_to_start_a_conversat.html
SMALLab, see: http://www.instituteofplay.com/node/169 and http://ame2.asu.edu/projects/emlearning/smallab/smallab.php