Smart City Gnosys

Smart city article details

Title Comparing CNN And Human Crafted Features For Human Activity Recognition
ID_Doc 15088
Authors Cruciani F.; Vafeiadis A.; Nugent C.; Cleland I.; McCullagh P.; Votis K.; Giakoumis D.; Tzovaras D.; Chen L.; Hamzaoui R.
Year 2019
Published Proceedings - 2019 IEEE SmartWorld, Ubiquitous Intelligence and Computing, Advanced and Trusted Computing, Scalable Computing and Communications, Internet of People and Smart City Innovation, SmartWorld/UIC/ATC/SCALCOM/IOP/SCI 2019
DOI http://dx.doi.org/10.1109/SmartWorld-UIC-ATC-SCALCOM-IOP-SCI.2019.00190
Abstract Deep learning techniques such as Convolutional Neural Networks (CNNs) have shown good results in activity recognition. One of the advantages of using these methods resides in their ability to generate features automatically. This ability greatly simplifies the task of feature extraction, which usually requires domain-specific knowledge, especially when using big data, where data-driven approaches can lead to anti-patterns. Despite the advantage of this approach, very little work has been undertaken on analyzing the quality of extracted features, and more specifically on how model architecture and parameters affect the ability of those features to separate activity classes in the final feature space. This work focuses on identifying the optimal parameters for recognition of simple activities, applying this approach to signals from both inertial and audio sensors. The paper provides the following contributions: (i) a comparison of automatically extracted CNN features with gold-standard Human Crafted Features (HCF), and (ii) a comprehensive analysis of how architecture and model parameters affect the separation of target classes in the feature space. Results are evaluated using publicly available datasets. In particular, we achieved a 93.38% F-Score on the UCI-HAR dataset, using 1D CNNs with 3 convolutional layers and a kernel size of 32, and a 90.5% F-Score on the DCASE 2017 development dataset, simplified to three classes (indoor, outdoor and vehicle), using 2D CNNs with 2 convolutional layers and a 2x2 kernel size. © 2019 IEEE.
Author Keywords Convolutional Neural Networks; Deep Learning; Free-living; Human Activity Recognition
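Illustrative sketch (not the authors' code): the abstract reports a 1D CNN with 3 convolutional layers and kernel size 32 for UCI-HAR inertial windows, and a 2D CNN with 2 convolutional layers and 2x2 kernels for the three-class (indoor, outdoor, vehicle) DCASE 2017 audio task. The PyTorch models below mirror only those reported parameters; filter counts, pooling, input shapes, channel counts and class counts are assumptions added for illustration.

```python
# Hypothetical sketch matching the architecture parameters stated in the abstract;
# everything not stated there (filter counts, pooling, input shapes) is assumed.
import torch
import torch.nn as nn

class InertialCNN1D(nn.Module):
    """1D CNN: 3 convolutional layers, kernel size 32 (filter counts assumed)."""
    def __init__(self, in_channels: int = 9, num_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=32, padding=16), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=32, padding=16), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=32, padding=16), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time to a fixed-size feature vector
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, window_length), e.g. fixed-length inertial windows
        return self.classifier(self.features(x).flatten(1))

class AudioCNN2D(nn.Module):
    """2D CNN: 2 convolutional layers, 2x2 kernels, 3 scene classes."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, freq_bins, time_frames), e.g. a log-mel spectrogram
        return self.classifier(self.features(x).flatten(1))

# Shape check only; the datasets themselves are not bundled here.
inertial_logits = InertialCNN1D()(torch.randn(4, 9, 128))  # -> (4, 6)
audio_logits = AudioCNN2D()(torch.randn(4, 1, 64, 96))     # -> (4, 3)
```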


Similar Articles


Id | Similarity | Authors | Title | Published
59700 | 0.88 | Sivakumar K.; Perumal T.; Yaakob R.; Marlisah E. | Unobstructive Human Activity Recognition: Probabilistic Feature Extraction With Optimized Convolutional Neural Network For Classification | AIP Conference Proceedings, 2816, 1 (2024)
28976 | 0.864 | Imran H.A.; Latif U. | HHARNet: Taking Inspiration From Inception And Dense Networks For Human Activity Recognition Using Inertial Sensors | HONET 2020 - IEEE 17th International Conference on Smart Communities: Improving Quality of Life using ICT, IoT and AI (2020)
16779 | 0.863 | Moshiri P.F.; Nabati M.; Shahbazian R.; Ghorashi S.A. | CSI-Based Human Activity Recognition Using Convolutional Neural Networks | ICCKE 2021 - 11th International Conference on Computer Engineering and Knowledge (2021)
14988 | 0.858 | Gomaa W. | Comparative Analysis Of Different Approaches To Human Activity Recognition Based On Accelerometer Signals | Studies in Big Data, 77 (2021)
3893 | 0.856 | Cruciani F.; Sun C.; Zhang S.; Nugent C.; Li C.; Song S.; Cheng C.; Cleland I.; McCullagh P. | A Public Domain Dataset For Human Activity Recognition In Free-Living Conditions | Proceedings - 2019 IEEE SmartWorld, Ubiquitous Intelligence and Computing, Advanced and Trusted Computing, Scalable Computing and Communications, Internet of People and Smart City Innovation, SmartWorld/UIC/ATC/SCALCOM/IOP/SCI 2019 (2019)
1826 | 0.856 | Turetta C.; Demrozi F.; Pravadelli G. | A Freely Available System For Human Activity Recognition Based On A Low-Cost Body Area Network | Proceedings - 2022 IEEE 46th Annual Computers, Software, and Applications Conference, COMPSAC 2022 (2022)
17990 | 0.855 | Mizuno M.; Hasegawa T. | Deep Metric Learning For Sensor-Based Human Activity Recognition | ACM International Conference Proceeding Series (2019)