Case Study: Ann Arbor Hands-On Museum’s Personalized Interactive Exhibits


The Hands-On Museum is a children’s STEAM (science, technology, engineering, art, and math) museum in Ann Arbor, Michigan, USA. As a non-profit organization, the Museum relies on field trip ticket sales for a significant portion of its operating income. Museum educators were eager to offer an enhanced field trip experience for school children, one that would yield quantitative data about the learning experience and the science standards that schools and teachers are held to. That data helps teachers tie their museum field trips to curriculum standards and ensure that their time (and their budget) is spent meaningfully.


The Museum wanted to accomplish two goals: encourage school field trips by providing teachers with quantitative data, and give children a richer experience by slowing them down long enough to interact meaningfully with each exhibit and learn something more. To accomplish both goals, the Museum would need to track which students were at which exhibits and what they did there.


TorranceLearning proposed an iterative project combining RFID technology, tablets, and xAPI to meet all these needs (future iterations will use beacon technology as a more reliable means of tracking highly mobile children). In the first iteration, the Museum used RFID tags embedded in nametag lanyards to passively log students into each exhibit. As children come into range of a wall-mounted antenna, a tablet computer mounted nearby greets them by name and engages them in a short series of questions and explorations with the exhibit.

Data from each student activity is sent to the LRS (Learning Record Store) immediately after the student or group completes the interaction, or after they “log out” by leaving the area (as determined by the antenna). Teachers and museum staff can then access a dashboard showing an activity stream and some simple data visualizations by student and by exhibit. A simple search function allows teachers to search for specific text strings found either in the xAPI statements or in the text responses typed in by students.
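To make the flow concrete, here is a minimal sketch of the kind of xAPI statement a tablet might send to the LRS after an interaction. The actor account, verb, and activity IRIs are illustrative placeholders, not the Museum’s actual vocabulary or endpoints.

```python
import json
import uuid
from datetime import datetime, timezone

def build_exhibit_statement(student_name, student_id, verb, exhibit_name, exhibit_iri):
    """Build a minimal xAPI statement for one student interaction at an exhibit.

    All identifiers below are hypothetical examples for illustration.
    """
    return {
        "id": str(uuid.uuid4()),
        "actor": {
            "objectType": "Agent",
            "name": student_name,
            # An account identifier avoids exposing student email addresses.
            "account": {"homePage": "https://example.org/students", "name": student_id},
        },
        "verb": {
            "id": f"http://adlnet.gov/expapi/verbs/{verb}",
            "display": {"en-US": verb},
        },
        "object": {
            "objectType": "Activity",
            "id": exhibit_iri,
            "definition": {"name": {"en-US": exhibit_name}},
        },
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = build_exhibit_statement(
    "Avery", "student-042", "interacted",
    "Water Table", "https://example.org/exhibits/water-table")
print(json.dumps(stmt, indent=2))
```

Because statements like this are structured JSON, both the dashboard’s text search and its per-student and per-exhibit views can be driven directly from the stored data.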


When a student—or group of students—approaches the exhibit, the antenna picks up their name badges and the tablet opens the appropriate grade-level content. One advantage of the xAPI over SCORM is that it can record the same activity data for multiple learners at once. Each time a question is answered or a response given on the tablet, an xAPI statement is formed.
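Since each xAPI statement names a single actor, one way to record a shared interaction is to emit one statement per learner for the same activity. The sketch below assumes this fan-out approach; the IDs and IRIs are hypothetical.

```python
def group_statements(student_ids, verb_iri, exhibit_iri):
    """Fan a single group interaction out into one xAPI statement per learner.

    Hypothetical helper: identifiers below are illustrative, not the
    Museum's actual vocabulary.
    """
    display = verb_iri.rsplit("/", 1)[-1]  # e.g. "experienced"
    return [
        {
            "actor": {
                "objectType": "Agent",
                "account": {"homePage": "https://example.org/students", "name": sid},
            },
            "verb": {"id": verb_iri, "display": {"en-US": display}},
            "object": {"objectType": "Activity", "id": exhibit_iri},
        }
        for sid in student_ids
    ]

batch = group_statements(
    ["s-01", "s-02", "s-03"],
    "http://adlnet.gov/expapi/verbs/experienced",
    "https://example.org/exhibits/water-table")
```

Each learner in the group then gets an identical record of the shared activity under their own identity, which is what feeds the per-student dashboard views.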

The data is sent to a LearnShare LRS for storage. A separate dashboard page requests these statements from the LRS to create a nearly real-time report. A key benefit in this application is that xAPI statements readily produce a human-readable activity stream. While that stream is interesting, in any quantity it quickly becomes overwhelming and far less useful, so JavaScript widgets render simple graphs that quickly and concisely display activity by exhibit and by student. Still more data is available that could be used to create whatever meaningful visualizations might be required in the future.
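A dashboard page like this typically pulls statements by issuing a GET against the LRS Statement Resource. The sketch below builds such a query URL using parameter names from the xAPI specification (`activity`, `agent`, `since`, `limit`); the endpoint itself is a placeholder, not LearnShare’s actual URL.

```python
import json
from urllib.parse import urlencode

def statements_query_url(endpoint, activity_iri=None, agent_mbox=None,
                         since=None, limit=50):
    """Build the GET /statements query a dashboard might use to pull data.

    Parameter names follow the xAPI Statement Resource specification;
    the endpoint is a hypothetical placeholder.
    """
    params = {"limit": limit}
    if activity_iri:
        params["activity"] = activity_iri
    if agent_mbox:
        # The xAPI spec expects the agent filter as a JSON-encoded Agent object.
        params["agent"] = json.dumps({"mbox": agent_mbox})
    if since:
        params["since"] = since  # ISO 8601 timestamp
    return f"{endpoint}/statements?{urlencode(params)}"

url = statements_query_url(
    "https://lrs.example.org/xapi",
    activity_iri="https://example.org/exhibits/water-table")
```

Filtering by activity gives the per-exhibit view; filtering by agent gives the per-student view; a `since` timestamp keeps the report nearly real-time by fetching only new statements.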

In SCORM, a user must generally be logged into a system to record individual progress and interactions. The xAPI, by contrast, allows us to record the activity of anonymous users in any location. That choice might be made for privacy reasons, or simply to gather anonymous data without a complex authentication system that slows down user interaction, an especially important consideration in a setting with potentially impatient children.
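One way to represent such an anonymous user is an xAPI Agent identified only by a random, session-scoped account, with no name or email attached. This is a sketch under that assumption; the `homePage` value is a placeholder.

```python
import uuid

def anonymous_actor():
    """An xAPI Agent carrying no personally identifying information.

    The account name is a random session ID, so activity can still be
    grouped per visitor without knowing who the visitor is. The homePage
    is a hypothetical placeholder.
    """
    return {
        "objectType": "Agent",
        "account": {
            "homePage": "https://example.org/anonymous",
            "name": str(uuid.uuid4()),  # random ID, not tied to a student
        },
    }

actor = anonymous_actor()
```

The random account still lets the LRS correlate every statement from one visit, so aggregate reporting works even when no one logs in.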

When tracking interactions, it’s easy to fall back on SCORM-style question-and-answer quizzes. The xAPI allows the Museum and teachers to track interactions that push the boundaries a bit more: free-form text, multi-part constructions, and so on. In future phases, the xAPI will also allow recording objective data from the exhibits themselves.
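A free-form typed response, for example, can be captured in the statement’s `result.response` field, something SCORM’s fixed interaction model handles poorly. A minimal sketch, with illustrative verb and activity IRIs:

```python
def free_text_statement(actor, exhibit_iri, prompt, response_text):
    """Record a free-form typed answer using xAPI's result.response field.

    The verb and activity-type IRIs come from the ADL vocabulary; the
    exhibit IRI and prompt are hypothetical examples.
    """
    return {
        "actor": actor,
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/answered",
            "display": {"en-US": "answered"},
        },
        "object": {
            "objectType": "Activity",
            "id": exhibit_iri,
            "definition": {
                "name": {"en-US": prompt},
                "type": "http://adlnet.gov/expapi/activities/cmi.interaction",
                "interactionType": "fill-in",
            },
        },
        "result": {"response": response_text},
    }

stmt = free_text_statement(
    {"objectType": "Agent",
     "account": {"homePage": "https://example.org/students", "name": "s-07"}},
    "https://example.org/exhibits/water-table",
    "What did you notice about the water flow?",
    "It goes faster when the channel is narrow")
```

Because the full response text lands in the statement, the dashboard’s text search can find it later, and richer multi-part interactions can be modeled the same way with additional result and extension fields.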

Tools Needed