
Vision-Based Action Recognition in the Internal Construction Site Using Interactions between Worker Actions and Construction Objects

J. Y. Kim, C. H. Caldas
Pages 661-668 (2013 Proceedings of the 30th ISARC, Montréal, Canada, ISBN 978-1-62993-294-1, ISSN 2413-5844)
Abstract:

This paper presents a novel action recognition method for observing human workers that exploits the interactions between actions and related objects on an internal construction site. The method can be used to measure work rates for labour productivity monitoring, which is critical because labour productivity significantly affects the performance of a construction project. However, construction sites are generally crowded with large numbers of workers and objects, and this congestion disrupts the accurate, automatic recognition of construction workers’ actions. It is also one reason that existing automatic action recognition studies in the construction domain focus mainly on the workers’ actions themselves. At the same time, these crowded conditions mean that sites offer many cues that can be exploited for automatic action recognition. According to psychological studies, clear interactions take place between human actions and related objects, such as between hammering and a hammer, and humans use these interactions to recognize actions or objects more accurately. On a construction site, workers, materials, tools, and equipment are carefully planned ahead of actual construction. The categories of workers and objects are therefore pre-defined and, as noted, specific interactions define the relations between worker actions and objects. In this paper, the interactions are limited to those between human workers and their hand-held objects, and action recognition results are combined with hand-held object information to improve recognition accuracy. Even with this limited set of interactions, the experiments in this paper show a significant improvement in action recognition. The paper describes how these interactions are used to improve construction action recognition accuracy based on human skeleton data and 2D color video from a Microsoft KINECT sensor.

Keywords: Action Recognition, Work Sampling, Activity Analysis, KINECT, Skeleton, Human-Object Interaction
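
As a rough illustration of the fusion idea described in the abstract, the sketch below combines per-frame action scores from a skeleton-based classifier with hand-held object detections through a co-occurrence prior. This is not the authors' implementation; the action labels, object labels, co-occurrence values, and the multiplicative fusion rule are all assumptions made for illustration only.

```python
# Minimal sketch (assumed, not from the paper): fusing skeleton-based action
# scores with a hand-held object cue via an action-object co-occurrence prior.
import numpy as np

ACTIONS = ["hammering", "drilling", "idle"]   # hypothetical action set
OBJECTS = ["hammer", "drill", "none"]         # hypothetical hand-held objects

# Assumed P(object | action): rows follow ACTIONS, columns follow OBJECTS.
CO_OCCURRENCE = np.array([
    [0.85, 0.05, 0.10],   # hammering -> usually a hammer in hand
    [0.05, 0.85, 0.10],   # drilling  -> usually a drill in hand
    [0.20, 0.20, 0.60],   # idle      -> often empty-handed
])

def fuse(action_scores: np.ndarray, object_scores: np.ndarray) -> str:
    """Combine action-classifier scores (e.g. from KINECT skeleton features)
    with object-detector scores (e.g. from the 2D color image)."""
    object_support = CO_OCCURRENCE @ object_scores   # expected object evidence per action
    fused = action_scores * object_support
    fused /= fused.sum()                             # renormalize
    return ACTIONS[int(np.argmax(fused))]

# Example: the skeleton classifier is uncertain, but a hammer is detected.
action_scores = np.array([0.45, 0.40, 0.15])
object_scores = np.array([0.80, 0.10, 0.10])
print(fuse(action_scores, object_scores))            # -> "hammering"
```

In this toy example the object cue resolves an ambiguity the skeleton classifier alone cannot, which is the kind of accuracy gain the abstract attributes to exploiting worker-object interactions.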