FTL-HAR: A Framework for Few-Shot Activity Recognition
DOI: https://doi.org/10.63313/AJET.9026

Keywords: Few-Shot Learning, Transfer Learning, Human Activity Recognition, Wearable Devices

Abstract
With the growing popularity of wearable devices and IoT technology, sensor-based human activity recognition (HAR) has become valuable in fields such as smart healthcare. However, traditional deep learning approaches require large labeled datasets and struggle to generalize to new activity categories, users, or environments where labeled data is scarce.
To address this, we propose FTL-HAR, a few-shot transfer learning method for human activity recognition. It transfers knowledge from pre-trained models, enabling adaptation to new activity categories with minimal labeled data. Experiments on the public PAMAP2 and OPPORTUNITY datasets under 1-shot and 5-shot settings show that FTL-HAR significantly outperforms traditional methods by exploiting pre-trained features for rapid fine-tuning.
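The core idea of the abstract — keeping a pre-trained feature extractor frozen and rapidly fitting a lightweight classifier on a handful of labeled examples per class — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the feature dimension, learning rate, toy data, and the choice of a softmax linear head are all assumptions made for the example.

```python
import numpy as np

def fine_tune_head(feats, labels, n_classes, lr=0.1, epochs=100):
    """Fit a softmax linear head on frozen pre-trained features
    (a stand-in for the rapid fine-tuning stage on k-shot data)."""
    d = feats.shape[1]
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[labels]              # one-hot targets
    for _ in range(epochs):
        logits = feats @ W + b
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)
        grad = P - Y                           # softmax cross-entropy gradient
        W -= lr * feats.T @ grad / len(feats)
        b -= lr * grad.mean(axis=0)
    return W, b

def predict(feats, W, b):
    return (feats @ W + b).argmax(axis=1)

# Toy 5-shot task: two activity classes, 8-dim "pre-trained" embeddings,
# drawn from well-separated Gaussians so the example is deterministic enough.
rng = np.random.default_rng(1)
support = np.vstack([rng.normal(0, 1, (5, 8)), rng.normal(3, 1, (5, 8))])
support_y = np.array([0] * 5 + [1] * 5)
W, b = fine_tune_head(support, support_y, n_classes=2)

query = np.vstack([rng.normal(0, 1, (3, 8)), rng.normal(3, 1, (3, 8))])
preds = predict(query, W, b)
print(preds)
```

In practice the frozen extractor would be a deep network pre-trained on source activities, and only the small head (here, `W` and `b`) is updated from the 1-shot or 5-shot support set, which is what makes adaptation cheap when labeled data is scarce.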
License
Copyright (c) 2025 by author(s) and Erytis Publishing Limited.

This work is licensed under a Creative Commons Attribution 4.0 International License.
