
Learning deep features for kNN-based human activity recognition.

Sani, Sadiq; Wiratunga, Nirmalie; Massie, Stewart

Authors

Sadiq Sani
Nirmalie Wiratunga
Stewart Massie



Contributors

Antonio A. Sanchez-Ruiz
Editor

Anders Kofod-Petersen
Editor

Abstract

A CBR approach to Human Activity Recognition (HAR) uses the kNN algorithm to classify sensor data into different activity classes. Several feature representation approaches have been proposed for sensor data in HAR. These include shallow features, which are either hand-crafted from the time and frequency domains or taken as the coefficients of frequency transformations. Alternatively, deep features can be extracted using deep learning approaches. These representation approaches have been compared in previous work without a consistent best approach being identified. In this paper, we explore the question of which representation approach is best suited to kNN. Accordingly, we compare five feature representation approaches (ranging from shallow to deep) on accelerometer data collected from two body locations, wrist and thigh. Results show that deep features produce the best results for kNN, compared to both hand-crafted and frequency-transform features, by a margin of up to 6.5% on the wrist and over 2.2% on the thigh. In addition, kNN produces very good results with as little as a single epoch of training for the deep features.
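
The pipeline summarised in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the network shape, the synthetic accelerometer windows and all parameter values below are assumptions for illustration only. A small 1D CNN is trained briefly on windowed tri-axial accelerometer data, its penultimate-layer activations are taken as the deep features, and a kNN classifier is then fitted on those features.

    # Hypothetical sketch: deep features + kNN for activity recognition.
    import numpy as np
    import torch
    import torch.nn as nn
    from sklearn.neighbors import KNeighborsClassifier

    # Synthetic stand-in for windowed accelerometer data:
    # 600 windows, 3 axes (x, y, z), 128 samples per window, 4 activity classes.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((600, 3, 128)).astype(np.float32)
    y = rng.integers(0, 4, size=600)

    # Small 1D CNN; the layer before the classifier head provides the deep features.
    feature_extractor = nn.Sequential(
        nn.Conv1d(3, 32, kernel_size=5, padding=2), nn.ReLU(),
        nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
        nn.AdaptiveAvgPool1d(1), nn.Flatten(),      # -> 64-dimensional feature vector
    )
    model = nn.Sequential(feature_extractor, nn.Linear(64, 4))

    # Train the network for a single full-batch pass, i.e. one epoch
    # (the paper reports kNN already works well after one epoch of training).
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    inputs, targets = torch.from_numpy(X), torch.from_numpy(y).long()
    model.train()
    optimiser.zero_grad()
    nn.CrossEntropyLoss()(model(inputs), targets).backward()
    optimiser.step()

    # Use the learned representation as the kNN feature space.
    model.eval()
    with torch.no_grad():
        deep_features = feature_extractor(inputs).numpy()

    knn = KNeighborsClassifier(n_neighbors=3)
    knn.fit(deep_features[:500], y[:500])
    print("kNN accuracy on deep features:", knn.score(deep_features[500:], y[500:]))

Because the data above is random noise, the printed accuracy is only illustrative. With real accelerometer windows, hand-crafted or frequency-transform baselines can be compared by simply swapping the feature matrix passed to the kNN classifier.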

Citation

SANI, S., WIRATUNGA, N. and MASSIE, S. 2017. Learning deep features for kNN-based human activity recognition. In Sanchez-Ruiz, A.A. and Kofod-Petersen, A. (eds.) Workshop proceedings of the 25th International conference on case-based reasoning (ICCBR 2017), 26-29 June 2017, Trondheim, Norway. CEUR workshop proceedings, 2028. Aachen: CEUR-WS [online], session 2: case-based reasoning and deep learning workshop (CBRDL-2017), pages 95-103. Available from: http://ceur-ws.org/Vol-2028/paper9.pdf

Conference Name 25th International conference on case-based reasoning (ICCBR 2017)
Conference Location Trondheim, Norway
Start Date Jun 26, 2017
End Date Jun 29, 2017
Acceptance Date May 25, 2017
Online Publication Date Jun 26, 2017
Publication Date Dec 18, 2017
Deposit Date Sep 4, 2017
Publicly Available Date Sep 4, 2017
Print ISSN 1613-0073
Publisher CEUR Workshop Proceedings
Pages 95-103
Series Title CEUR workshop proceedings
Series Number 2028
Series ISSN 1613-0073
Keywords Human activity recognition; Feature representation; Deep learning
Public URL http://hdl.handle.net/10059/2489
Publisher URL http://ceur-ws.org/Vol-2028/paper9.pdf
