Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications
| dc.contributor.author | Cartas Ayala, Alejandro | |
| dc.contributor.author | Radeva, Petia | |
| dc.contributor.author | Dimiccoli, Mariella | |
| dc.date.accessioned | 2021-09-02T11:05:27Z | |
| dc.date.available | 2021-09-02T11:05:27Z | |
| dc.date.issued | 2020-04-27 | |
| dc.date.updated | 2021-09-02T11:05:28Z | |
| dc.description.abstract | Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants of state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarce, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset can easily be transferred to other domains with a very small amount of labeled data. Taken together, these results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications. | |
| dc.format.extent | 20 p. | |
| dc.format.mimetype | application/pdf | |
| dc.identifier.idgrec | 708307 | |
| dc.identifier.issn | 2169-3536 | |
| dc.identifier.uri | https://hdl.handle.net/2445/179836 | |
| dc.language.iso | eng | |
| dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | |
| dc.relation.isformatof | Reproduction of the document published at: https://doi.org/10.1109/ACCESS.2020.2990333 | |
| dc.relation.ispartof | IEEE Access, 2020, vol. 8, p. 77344-77363 | |
| dc.relation.uri | https://doi.org/10.1109/ACCESS.2020.2990333 | |
| dc.rights | cc-by (c) Cartas Ayala, Alejandro et al., 2020 | |
| dc.rights.accessRights | info:eu-repo/semantics/openAccess | |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
| dc.source | Articles publicats en revistes (Matemàtiques i Informàtica) | |
| dc.subject.classification | Anàlisi de conducta | |
| dc.subject.classification | Sistemes persona-màquina | |
| dc.subject.other | Behavioral assessment | |
| dc.subject.other | Human-machine systems | |
| dc.title | Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications | |
| dc.type | info:eu-repo/semantics/article | |
| dc.type | info:eu-repo/semantics/publishedVersion |