Related Researcher

Baek, Seungryul (백승렬)
UNIST VISION AND LEARNING LAB.

Detailed Information


3DMesh-GAR: 3D Human Body Mesh-Based Method for Group Activity Recognition

Author(s)
Saqlain, Muhammad; Kim, Donguk; Cha, Junuk; Lee, Changhwa; Lee, Seongyeong; Baek, Seungryul
Issued Date
2022-02
DOI
10.3390/s22041464
URI
https://scholarworks.unist.ac.kr/handle/201301/58151
Fulltext
https://www.mdpi.com/1424-8220/22/4/1464
Citation
SENSORS, v.22, no.4, pp.1464
Abstract
Group activity recognition is a prime research topic in video understanding and has many practical applications, such as crowd behavior monitoring, video surveillance, etc. To understand multi-person/group actions, a model should not only identify each individual person's action in context but also describe their collective activity. Many previous works adopt skeleton-based approaches with graph convolutional networks for group activity recognition. However, these approaches are subject to limitations in scalability, robustness, and interoperability. In this paper, we propose 3DMesh-GAR, a novel approach to 3D human body Mesh-based Group Activity Recognition, which relies on a body center heatmap, camera map, and mesh parameter map instead of the complex and noisy 3D skeleton of each person in the input frames. We adopt a 3D mesh creation method that is conceptually simple, single-stage, and bounding-box free, and is able to handle highly occluded and multi-person scenes without any additional computational cost. We evaluate 3DMesh-GAR on a standard group activity dataset, the Collective Activity Dataset, and achieve state-of-the-art performance for group activity recognition.
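The abstract describes a pipeline in which per-person 3D body meshes, recovered from a body center heatmap, camera map, and mesh parameter map, are aggregated into a group-level activity prediction. The sketch below is a minimal, hypothetical illustration of that aggregation step only, not the authors' implementation: the module name, feature dimensions, and max-pooling over people are assumptions made for illustration, written in PyTorch.

# Hypothetical sketch: mesh-derived per-person features -> group activity logits.
# All names and dimensions are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn

class GroupActivityHead(nn.Module):
    """Aggregates per-person mesh features into a single group-activity prediction."""
    def __init__(self, feat_dim=128, num_activities=5):
        super().__init__()
        self.person_encoder = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_activities)

    def forward(self, person_feats):
        # person_feats: (batch, num_people, feat_dim) features derived from each person's 3D mesh
        encoded = self.person_encoder(person_feats)
        group_feat = encoded.max(dim=1).values   # order-invariant pooling over people
        return self.classifier(group_feat)       # (batch, num_activities)

# Toy usage: random tensors stand in for real mesh-parameter features.
feats = torch.randn(2, 6, 128)                   # 2 clips, 6 people, 128-d features each
logits = GroupActivityHead()(feats)
print(logits.shape)                              # torch.Size([2, 5])

Max-pooling over the person dimension is one simple way to keep the prediction invariant to the number and ordering of people in a scene; the paper itself may use a different aggregation scheme.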
Publisher
MDPI
ISSN
1424-8220
Keyword (Author)
3D human activity recognition; human body mesh estimation; feature extraction; deep learning; video understanding
