
Files in This Item:
File: s10846-019-01052-8.pdf (2.12 MB, Adobe PDF)
Title: First-person Video Analysis for Evaluating Skill Level in the Humanitude Tender-Care Technique
Authors: Nakazawa, Atsushi
Mitsuzumi, Yu
Watanabe, Yuki
Kurazume, Ryo
Yoshikawa, Sakiko
Honda, Miwako
Author's alias: 中澤, 篤志
三鼓, 悠
倉爪, 亮
吉川, 左紀子
本田, 美和子
Keywords: Dementia
Deep neural network (DNN)
Skill evaluation
Wearable system
Computer vision
First person video
Issue Date: 4-Jul-2019
Publisher: Springer Science and Business Media LLC
Journal title: Journal of Intelligent & Robotic Systems
Abstract: In this paper, we describe a wearable first-person video (FPV) analysis system for evaluating the skill levels of caregivers. This is part of our project that aims to quantify and analyze the tender-care technique known as Humanitude by using wearable sensing devices and AI technology. Using our system, caregivers can assess and improve their care skills by themselves. From the FPVs of care sessions taken by wearable cameras worn by caregivers, we obtained the 3D facial distance, pose, and eye-contact states between caregivers and receivers by using facial landmark detection and deep neural network (DNN)-based eye contact detection. We applied statistical analysis to these features and developed algorithms that provide scores for tender-care skill. In experiments, we first evaluated the performance of our DNN-based eye contact detection by using eye contact datasets prepared from YouTube videos and FPVs of simulated conversational scenes. We then performed skill evaluations using Humanitude training scenes involving three novice caregivers, two Humanitude experts, and seven middle-level students. The results showed that our eye contact detection outperformed existing methods and that our skill evaluations can estimate care skill levels.
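The abstract describes aggregating per-frame features (eye-contact states from a DNN detector, caregiver-receiver facial distance from landmarks) into a per-session skill score. As an illustration only, that aggregation step might be sketched as below; the class and function names, the target distance, and the scoring weights are all hypothetical and are not the authors' published method.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class SessionFeatures:
    """Per-frame features extracted from a first-person care video.

    eye_contact: per-frame eye-contact flags (e.g. output of a DNN detector).
    face_distance_m: per-frame caregiver-receiver facial distance in metres.
    """
    eye_contact: list[bool]
    face_distance_m: list[float]

def skill_score(f: SessionFeatures, target_distance_m: float = 0.5) -> float:
    """Toy skill score in [0, 1]: rewards a high eye-contact ratio and a
    facial distance close to a hypothetical target. Weights are illustrative."""
    contact_ratio = mean(1.0 if c else 0.0 for c in f.eye_contact)
    distance_penalty = mean(abs(d - target_distance_m) for d in f.face_distance_m)
    return 0.7 * contact_ratio + 0.3 * max(0.0, 1.0 - distance_penalty)

# Example: a session with frequent eye contact at a close, steady distance
session = SessionFeatures(
    eye_contact=[True, True, False, True],
    face_distance_m=[0.5, 0.55, 0.5, 0.45],
)
print(skill_score(session))
```

A real system would, as the abstract notes, derive such scores from statistical analysis of expert-versus-novice sessions rather than fixed hand-picked weights.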
Description: Development of an AI-based method for evaluating proficiency in a care technique that conveys tenderness: image recognition identifies the differences between experts and novices. Kyoto University press release, 2019-07-11.
Rights: © The Author(s) 2019. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
DOI(Published Version): 10.1007/s10846-019-01052-8
Appears in Collections: Journal Articles

