PlaceLab “Intensive Activity” Test Dataset (Dataset PLIA2)

Please read the PlaceLab Data overview document before continuing.

Overview of the PLIA2 Dataset

The top-level directory (2006-03-24) referred to here can be found (temporarily) in this location: http://www.media.mit.edu/~intille/PLIA2/ . We hope to find a more permanent home for this soon, but you can check back here for the new location.

This sample PlaceLab dataset was recorded on Friday March 24, 2006 from 10AM to 2PM with a volunteer from our research team who was familiar with the PlaceLab, but not a creator of the core technical infrastructure.

The researcher was asked to perform a set of common household activities during the four-hour period using a set of instructions. Activities included the following: preparing a recipe, doing a load of dishes, cleaning the kitchen, doing laundry, making the bed, and light cleaning around the apartment. The volunteer determined the sequence, pace, and concurrency of these activities and also integrated additional household tasks. Our intent was to have a short test dataset of a manageable size that could be easily placed on the web without concerns about anonymity. We wanted this test dataset, however, to show a variety of activity types and activate as many sensors as possible, but in a natural way. In addition to the activities above, the researcher searches for items, uses appliances, talks on the phone, answers email, and performs other everyday tasks. The researcher wore five mobile accelerometers (one on each limb and one on the hip) and a Polar M32 wireless heart rate monitor. The researcher carried an SMT 5600 mobile phone that ran experience sampling software that beeped and presented a set of questions about her activities.

The mobile accelerometer sensors are called MITes and are described in this publication:

E. Munguia Tapia, S.S. Intille, L. Lopez, and K. Larson, "The design of a portable kit of wireless sensors for naturalistic data collection" in Proceedings of PERVASIVE 2006. Berlin Heidelberg: Springer-Verlag, 2006, to appear.


The dataset includes four hours of partially (and soon to be fully) annotated video. The annotation was done using custom annotation software written by Randy Rockinson and Leevar Williams of MIT House_n. This software (called HandLense) is available for researchers to use to study this dataset. [Overview of HandLense and executable]

The annotations include descriptors for body posture, type of activity, location, and social context. The annotation ontology is described in more detail later.
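As an illustration only (the actual HandLense annotation file format had not yet been posted when this was written), an annotation interval carrying these four descriptor tracks might be represented along the following lines; all field names and values here are hypothetical:

```python
# Hypothetical sketch of one annotation interval; the real HandLense
# format may differ in structure and vocabulary.
from dataclasses import dataclass

@dataclass
class Annotation:
    start_s: float   # interval start, seconds from session start
    end_s: float     # interval end, seconds from session start
    posture: str     # body posture descriptor, e.g. "standing"
    activity: str    # activity descriptor, e.g. "washing dishes"
    location: str    # location descriptor, e.g. "kitchen"
    social: str      # social-context descriptor, e.g. "alone"

    def duration_s(self) -> float:
        """Length of the annotated interval in seconds."""
        return self.end_s - self.start_s

# Example interval: about three minutes of dishwashing in the kitchen.
a = Annotation(120.0, 305.5, "standing", "washing dishes", "kitchen", "alone")
```

A representation like this makes it easy to filter intervals by any one descriptor track (say, all "kitchen" intervals) while keeping the tracks time-aligned.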

Purpose

This dataset is intended primarily as a sample that researchers can explore in order to familiarize themselves with the type of data the PlaceLab is capable of generating. Researchers seriously interested in using PlaceLab datasets, including those from non-researcher volunteers staying in the facility for extended time periods, should contact us. The PlaceLab can be reconfigured to collect other types of sensor data as well.

Researchers who peruse or use this dataset are encouraged to contact House_n researchers and let them know. We may ask those who use the facility to write a letter of support so that we can pursue funding that would permit us to make larger (and more fully annotated) datasets available over time.

Images from this document and raw data from this dataset are not intended for use in news articles or other publications without the explicit consent of a PlaceLab researcher.

Ongoing Improvements 

Since the release of our first test dataset, we have made extensive improvements to the PlaceLab infrastructure, including:

We shortly plan to:

Directory Structure

Size: 8.19 GB total (primarily video and audio data)

The data can be (temporarily) found here: http://www.media.mit.edu/~intille/PLIA2/.

NOTE: We are working on the detailed specs for the PLIA2 dataset, which will be posted shortly. Annotation files will also be posted shortly. Check back here regularly and send us an email if you are interested in using the dataset. Use of the HandLense tool should provide a good overview, however. For details on some sensor types (although saved in a different format), see the PLIA1 dataset.
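Until the detailed specs are posted, a simple directory scan of a downloaded copy can confirm where the bulk of the 8.19 GB lives (per the overview, primarily in video and audio files). The sketch below is generic and assumes only a local directory path ("PLIA2" here is a hypothetical location for your downloaded copy):

```python
# Sketch: tally total bytes per file extension in a local copy of the
# dataset. The path "PLIA2" is an assumption; substitute your own.
import os
from collections import defaultdict

def size_by_extension(root):
    """Walk `root` and return a dict mapping file extension -> total bytes."""
    totals = defaultdict(int)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "(none)"
            totals[ext] += os.path.getsize(os.path.join(dirpath, name))
    return dict(totals)

if __name__ == "__main__":
    for ext, nbytes in sorted(size_by_extension("PLIA2").items()):
        print(f"{ext}\t{nbytes / 1e9:.2f} GB")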