1. Information about the Paper


Kwapisz, Jennifer R., Gary M. Weiss, and Samuel A. Moore. “Activity recognition using cell phone accelerometers.” ACM SigKDD Explorations Newsletter 12, no. 2 (2011): 74-82.


Mobile devices are becoming increasingly sophisticated and the latest generation of smart cell phones now incorporates many diverse and powerful sensors. These sensors include GPS sensors, vision sensors (i.e., cameras), audio sensors (i.e., microphones), light sensors, temperature sensors, direction sensors (i.e., magnetic compasses), and acceleration sensors (i.e., accelerometers). The availability of these sensors in mass-marketed communication devices creates exciting new opportunities for data mining and data mining applications. In this paper we describe and evaluate a system that uses phone-based accelerometers to perform activity recognition, a task which involves identifying the physical activity a user is performing. To implement our system we collected labeled accelerometer data from twenty-nine users as they performed daily activities such as walking, jogging, climbing stairs, sitting, and standing, and then aggregated this time series data into examples that summarize the user activity over 10-second intervals. We then used the resulting training data to induce a predictive model for activity recognition. This work is significant because the activity recognition model permits us to gain useful knowledge about the habits of millions of users passively—just by having them carry cell phones in their pockets. Our work has a wide range of applications, including automatic customization of the mobile device’s behavior based upon a user’s activity (e.g., sending calls directly to voicemail if a user is jogging) and generating a daily/weekly activity profile to determine if a user (perhaps an obese child) is performing a healthy amount of exercise.
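The pipeline the abstract describes — segmenting raw accelerometer time series into 10-second windows and summarizing each window as a feature vector for a classifier — can be sketched as follows. This is a minimal illustration on synthetic data: the 20 Hz sample rate and 10-second window come from the paper, but the feature set here (per-axis mean and standard deviation plus mean resultant magnitude) is an illustrative subset, not the paper's exact feature list.

```python
import numpy as np

SAMPLE_RATE_HZ = 20          # the paper samples the accelerometer at 20 Hz
WINDOW_SECONDS = 10          # each example summarizes a 10-second interval
WINDOW_LEN = SAMPLE_RATE_HZ * WINDOW_SECONDS  # 200 readings per example

def extract_features(window):
    """Summarize one (200, 3) window of (x, y, z) accelerometer readings.

    The features here are an illustrative subset, not the paper's full set:
    per-axis mean, per-axis standard deviation, and mean resultant magnitude.
    """
    means = window.mean(axis=0)
    stds = window.std(axis=0)
    magnitude = np.linalg.norm(window, axis=1).mean()
    return np.concatenate([means, stds, [magnitude]])

def windows_to_examples(stream):
    """Aggregate a raw (N, 3) time series into one feature vector per
    non-overlapping 10-second window, dropping any trailing partial window."""
    n_windows = len(stream) // WINDOW_LEN
    return np.array([
        extract_features(stream[i * WINDOW_LEN:(i + 1) * WINDOW_LEN])
        for i in range(n_windows)
    ])

# Synthetic demo: 60 seconds of "standing" (low variance around gravity on z).
rng = np.random.default_rng(0)
standing = rng.normal([0.0, 0.0, 9.8], 0.05, size=(60 * SAMPLE_RATE_HZ, 3))
examples = windows_to_examples(standing)
print(examples.shape)  # (6, 7): six 10-second examples, seven features each
```

The resulting `examples` array is what would be fed, together with activity labels, to an off-the-shelf classifier to induce the predictive model.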


2. My Review of the Paper


The authors of this paper have presented a good experimental model of activity recognition using accelerometer sensors. The authors have contributed several improvements over previous activity recognition work. These contributions are:

  1. It uses only a single device, a mass-marketed phone, instead of a research-only device. Therefore, it is easier and more practical to use in research.
  2. It collects data from more users than previous related works.



This work is not new. However, it still provides a useful foundation for future implementations and research on activity recognition using mobile sensors, including but not limited to accelerometers.


In my opinion, the authors of this paper failed to address the following:

  1. Do the Android phones used in this experiment have accelerometers with the same accuracy level? And since Android phones typically do not run the same OS version, could these differences affect the process and results of data collection?
  2. Why use supervised learning instead of recognizing naturally occurring, unlabeled activities?


Additional comments:

  1. Does “simply” adding more users to the data collection “significantly” make the conclusions more “statistically reliable” than those of previous works with fewer users?
  2. There are already mass-marketed implementations of this idea, for example: http://www.pcmag.com/slideshow/story/292474/the-25-best-fitness-apps
  3. Implementations have also improved. Current fitness apps can count how many steps you take and how long you have been walking or running using the accelerometer, even though they require you to tap a button to tell the app when to start counting.
  4. I have an idea of implementing activity recognition for sleep, in which the phone would behave accordingly, and there is already an app for this. Sleep Time by Azumio is an alarm clock that monitors and analyzes your sleep cycles to wake you up in the lightest sleep phase, allowing you to wake feeling rested and relaxed. Utilizing the iPhone’s accelerometer, Sleep Time senses your subtle movements throughout the night and graphs your sleep cycles. http://www.azumio.com/apps/sleep-time/
  5. This work uses a single device position: the user’s front pants pocket. In 2013, a new paper by Pekka Siirtola and Juha Röning, entitled “Ready-to-Use Activity Recognition for Smartphones,” improved on this methodology by demonstrating accurate activity recognition with a user- and body-position-independent accelerometer location. It successfully detected activities with the phone in several positions: the trousers’ pocket, inside a jacket, in a backpack, at the ear, on a table, and on the brachium.
  6. In 2012, “Simple and Complex Activity Recognition Through Smart Phones,” by Stefan Dernbach’s team, recognized more complex activities such as cooking, cleaning, taking medication, sweeping, watering plants, and washing hands using a smartphone.
  7. In 2012, “Real Time Activity Recognition Using a Cell Phone’s Accelerometer and Wi-Fi” was published by Enrique A. Garcia and Ramón Brena.
  8. In 2014, Theresia Ratih Dewi Saputri et al. published “User-Independent Activity Recognition via Three-Stage GA-Based Feature Selection” (http://www.hindawi.com/journals/ijdsn/2014/706287/), which proposed a subject-independent approach to activity recognition, based on the fact that each person has a different pattern for each activity.
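The step counting mentioned in point 3 above is typically done with simple peak detection on the acceleration magnitude signal. A minimal sketch on synthetic data follows; the threshold and debounce values are illustrative assumptions, not taken from any shipping fitness app.

```python
import numpy as np

def count_steps(magnitude, threshold=11.0, min_gap=10):
    """Count steps as peaks in the acceleration magnitude signal.

    A sample counts as a step if it exceeds `threshold` (m/s^2, above the
    ~9.8 gravity baseline) and occurs at least `min_gap` samples after the
    previous detected step, so one stride produces one count. Both values
    are illustrative, not calibrated against any real device.
    """
    steps = 0
    last = -min_gap  # allow a peak at index 0
    for i, m in enumerate(magnitude):
        if m > threshold and i - last >= min_gap:
            steps += 1
            last = i
    return steps

# Synthetic demo at 20 Hz: gravity baseline with one spike per second.
rate = 20
signal = np.full(10 * rate, 9.8)
signal[::rate] = 13.0           # ten strides over ten seconds
print(count_steps(signal))      # 10
```

Real step counters add filtering and adaptive thresholds, but this is the core idea behind accelerometer-based step counting.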