[2504.17696] Hierarchical and Multimodal Data for Daily Activity Understanding
Computer Science > Computer Vision and Pattern Recognition

arXiv:2504.17696 (cs)

[Submitted on 24 Apr 2025 (v1), last revised 26 Mar 2026 (this version, v4)]

Title: Hierarchical and Multimodal Data for Daily Activity Understanding

Authors: Ghazal Kaviani, Yavuz Yarici, Seulgi Kim, Mohit Prabhushankar, Ghassan AlRegib, Mashhour Solh, Ameya Patil

Abstract: Daily Activity Recordings for Artificial Intelligence (DARai, pronounced "Dahr-ree") is a multimodal, hierarchically annotated dataset constructed to understand human activities in real-world settings. DARai consists of continuous scripted and unscripted recordings of 50 participants in 10 different environments, totaling over 200 hours of data from 20 sensors, including multiple camera views, depth and radar sensors, wearable inertial measurement units (IMUs), electromyography (EMG), insole pressure sensors, biomonitor sensors, and a gaze tracker. To capture the complexity of human activities, DARai is annotated at three levels of hierarchy: (i) high-level activities (L1) that are independent tasks, (ii) lower-level actions (L2) that are patterns shared between activities, and (iii) fine-grained procedures (L3) that detail the exact execution steps for actions. The dataset annotations and recordings are designed so that 22.7% of L2 actions are shared between L1 activities...
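
To illustrate the three-level annotation hierarchy described in the abstract, the sketch below models a hypothetical annotation record in Python. The class names, field names, and example labels are assumptions made for illustration only; they are not taken from DARai's actual annotation schema or file format.

from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of DARai's L1/L2/L3 hierarchy.
# Field names and example labels are illustrative assumptions,
# not the dataset's actual schema.

@dataclass
class ProcedureL3:
    """Fine-grained procedure: an exact execution step of an action."""
    label: str          # e.g. "grasp kettle handle" (hypothetical label)
    start_s: float      # segment start time in seconds
    end_s: float        # segment end time in seconds

@dataclass
class ActionL2:
    """Lower-level action: a pattern that may be shared across activities."""
    label: str
    procedures: List[ProcedureL3] = field(default_factory=list)

@dataclass
class ActivityL1:
    """High-level activity: an independent task."""
    label: str
    environment: str    # one of the 10 recording environments
    actions: List[ActionL2] = field(default_factory=list)

# Example: one annotated segment, nested L1 -> L2 -> L3.
example = ActivityL1(
    label="making tea",
    environment="kitchen",
    actions=[
        ActionL2(
            label="pour water",
            procedures=[ProcedureL3("grasp kettle handle", 12.4, 14.1)],
        )
    ],
)

Because L2 actions are shared across L1 activities (22.7% per the abstract), a structure like this lets the same action label appear under multiple activities while each occurrence keeps its own fine-grained L3 steps and timestamps.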