Title

利用穿戴式傳感訊號辨識營建施工人員作業行為之機器學習模型

Parallel Title

MACHINE LEARNING MODEL FOR IDENTIFYING CONSTRUCTION WORKERS' OPERATIONS BASED ON WEARABLE SENSORS

DOI

10.6652/JoCICHE.202112_33(8).0004

Authors

Ren-Jye Dzeng (曾仁杰); Chia-Yu Liu (劉家喻); Hsien-Hui Hsueh (薛憲徽)

Keywords

傳感器 (sensor); 作業辨識 (operation identification); 生產力 (productivity); 生產效率 (production efficiency); activity

Journal

Journal of the Chinese Institute of Civil and Hydraulic Engineering (中國土木水利工程學刊)

Volume/Issue (Publication Date)

Vol. 33, No. 8 (2021/12/01)

Pages

629 - 639

Language

Traditional Chinese

Chinese Abstract (translated)

Improving construction productivity depends on accurate and effective measurement of production efficiency. Conventional efficiency measurement, however, is limited by the labor cost of the observation and analysis it requires. This study develops an LSTM-based machine learning model that identifies workers' operation categories from wearable-sensor data. Fifty-five subjects were recruited to perform simulated rebar tying, bricklaying, tile laying, and wheelbarrow moving, as well as resting, in a laboratory. The results show that if the goal is merely to determine whether a subject is working, identification accuracy ranges from 96.67% to 99.14%. If the goal is to identify the major operation category, accuracy ranges from 96.51% to 100%, which is feasible for field application. However, if the goal is to precisely identify detailed operations, only wheelbarrow moving and resting reach very high accuracy; the other three operations reach only moderate accuracy, which is not yet sufficient for actual field use.

English Abstract

Improving construction productivity depends on accurate and effective productivity measurement. Conventional measurement approaches are of limited use because of the high level of labor effort they demand. This research develops an LSTM-based machine learning model that identifies construction workers' operations. An experiment involving 55 recruited subjects was conducted to let the model learn from the collected data and to test its accuracy on typical activities, including rebar assembly, bricklaying, tile laying, wheelbarrow moving, and resting. The results show that the model achieves high identification accuracy when determining whether subjects are working or resting (96.67%~99.14%) and whether they are performing upper-limb, lower-limb, or static operations (96.51%~100%). However, when the objective is to identify detailed operations, accuracy is good only for wheelbarrow moving and resting; for the remaining operations it is not yet high enough for field use.
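The abstract describes an LSTM classifier that maps windows of wearable-sensor readings to operation categories. As a minimal sketch of how such a model consumes a sensor window, the pure-Python cell below runs an (untrained, randomly initialized) LSTM over a sequence of 3-axis accelerometer samples and softmaxes the final hidden state over five activity classes. The class labels, window length, dimensions, and all weights are illustrative assumptions, not the paper's actual architecture.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(W, v):
    # matrix-vector product; W is a list of rows
    return [sum(wi * vi for wi, vi in zip(row, v)) for row in W]

class LSTMCell:
    """Single LSTM cell with randomly initialized (untrained) weights."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = random.Random(seed)
        n = input_dim + hidden_dim
        # one stacked weight matrix for the input, forget, candidate, and output gates
        self.W = [[rng.gauss(0.0, 0.1) for _ in range(n)] for _ in range(4 * hidden_dim)]
        self.b = [0.0] * (4 * hidden_dim)
        self.H = hidden_dim

    def step(self, x, h, c):
        z = [zi + bi for zi, bi in zip(matvec(self.W, x + h), self.b)]
        H = self.H
        i = [sigmoid(v) for v in z[0:H]]          # input gate
        f = [sigmoid(v) for v in z[H:2 * H]]      # forget gate
        g = [math.tanh(v) for v in z[2 * H:3 * H]]  # candidate cell state
        o = [sigmoid(v) for v in z[3 * H:4 * H]]  # output gate
        c = [fj * cj + ij * gj for fj, cj, ij, gj in zip(f, c, i, g)]
        h = [oj * math.tanh(cj) for oj, cj in zip(o, c)]
        return h, c

def classify_window(cell, window, W_out):
    """Run the LSTM over one sensor window; softmax the last hidden state."""
    h = [0.0] * cell.H
    c = [0.0] * cell.H
    for x in window:
        h, c = cell.step(x, h, c)
    logits = matvec(W_out, h)
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical setup: 3-axis accelerometer samples, five activity classes
CLASSES = ["rebar tying", "bricklaying", "tile laying", "wheelbarrow moving", "resting"]
cell = LSTMCell(input_dim=3, hidden_dim=8, seed=42)
rng = random.Random(1)
W_out = [[rng.gauss(0.0, 0.1) for _ in range(8)] for _ in range(5)]
window = [[rng.gauss(0.0, 1.0) for _ in range(3)] for _ in range(60)]  # 60 timesteps
probs = classify_window(cell, window, W_out)
print(CLASSES[probs.index(max(probs))])
```

In a real deployment the weights would be trained on labeled sensor windows (e.g., via backpropagation through time in a framework such as PyTorch or Keras); the sketch only shows the forward pass that turns one window into a class distribution.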

Subject Classification Engineering > Civil and Architectural Engineering
Engineering > Hydraulic Engineering
Engineering > Municipal and Environmental Engineering
References
  1. 楊廷曄,薛憲徽,曾仁杰(2020)。營建生產作業行為自動辨識系統。中國土木水利工程學刊,32(1),75-90。
  2. Chaudhry, R.,Ofli, F.,Kurillo, G.,Bajcsy, R.,Vidal, R.(2013).Bio-inspired dynamic 3D discriminative skeletal features for human action recognition.Proc. IEEE Conference on Computer Vision and Pattern Recognition Workshops
  3. Chen, C.,Jafari, R.,Kehtarnavaz, N.(2016).A real-time human action recognition system using depth and inertial sensor fusion.IEEE J. of Sensors,16,773-781.
  4. Dzeng, R. J.,Fang, Y. C.,Cheng, I. C.(2014).A feasibility study of using smartphone built-in accelerometers to detect fall portents.Automation in Construction,38,74-86.
  5. Evangelidis, G.,Singh, G.,Horaud, R.(2014).Skeletal quads: Human action recognition using joint quadruples.Proc. International Conference on Pattern Recognition
  6. Fang, Y. C.,Dzeng, R. J.(2017).Accelerometer-based fall-portent detection algorithm for construction tiling operation.Automation in Construction,84,214-230.
  7. Godfrey, A.,Bourke, A. K.,Ólaighin, G. M.,van de Ven, P.,Nelson, J.(2011).Activity classification using a single chest mounted tri-axial accelerometer.Medical Engineering & Physics,33(9),1127-1135.
  8. Graves, A.,Schmidhuber, J.(2005).Framewise phoneme classification with bidirectional LSTM and other neural network architectures.Neural Networks,18(5-6),602-610.
  9. Hochreiter, S.,Schmidhuber, J.(1997).Long Short-term Memory.Neural Computation,9,1735-1780.
  10. Hussein, M. E.,Torki, M.,Gowayyed, M. A.,El-Saban, M.(2013).Human action recognition using a temporal hierarchy of covariance descriptors on 3D joint locations.23rd International Joint Conference on Artificial Intelligence
  11. Jeong, D. U.,Do, K. H.,Chung, W. Y.(2008).Implementation of the wireless activity monitoring system using accelerometer and fuzzy classifier.Int. J. of Information Systems for Logistics and Management,3(2),115-120.
  12. Karantonis, D. M.,Narayanan, M. R.,Mathie, M.,Lovell, N. H.,Celler, B. G.(2006).Implementation of a real-time human movement classifier using a triaxial accelerometer for ambulatory monitoring.IEEE Transactions on Information Technology in Biomedicine,10(1),156-167.
  13. Lee, J. G.,Kim, M. S.,Hwang, T. M.,Kang, S. J.(2016).A mobile robot which can follow and lead human by detecting user location and behavior with wearable devices.Proc. of IEEE International Conference on Consumer Electronics
  14. Lee, J.,Kim, J.(2016).Energy-efficient real-time human activity recognition on smart mobile devices.Mobile Information Systems,2016,2316757.
  15. Lee, S. W.,Mase, K.(2002).Activity and location recognition using wearable sensors.IEEE Pervasive Computing,1(3),24-32.
  16. Lin, T.,Horne, B. G.,Giles, C. L.(1998).How embedded memory in recurrent neural network architectures helps learning long-term temporal dependencies.Neural Networks,11,861-868.
  17. Luštrek, M.,Kaluža, B.(2008).Fall detection and activity recognition with machine learning.Informatica,33,205-212.
  18. Lv, F.,Nevatia, R.(2006).Recognition and segmentation of 3-D human action using HMM and multi-class AdaBoost.Proc. of the European Conference on Computer Vision
  19. Mathie, M. J.,Coster, A. C.,Lovell, N. H.(2004).Accelerometry: Providing an integrated, practical method for long-term, ambulatory monitoring of human movement.Physiological Measurement,25(2),R1-R20.
  20. Noor, I.(1998).Measuring construction labour productivity by daily visits.Proceedings of the 42nd annual meeting of AACE,Cincinnati, Ohio, USA:
  21. Parkka, J.,Ermes, M.,Korpipaa, P.,Mantyjarvi, J.,Peltola, J.,Korhonen, I.(2006).Activity classification using realistic data from wearable sensors.IEEE Transactions on Information Technology in Biomedicine,10(1),119-128.
  22. Powers, David M. W.(2011).Evaluation: From Precision, Recall and F-Measure to ROC, Informedness, Markedness & Correlation.Journal of Machine Learning Technologies,2(1),37-63.
  23. Radosavljević, M.,Malcolm, R.,Horner, W.(2002).The evidence of complex variability in construction labour productivity.Construction Management and Economics,20(1),3-12.
  24. Sheikh, Y.,Sheikh, M.,Shah, M.(2005).Exploring the space of a human action.Proc. IEEE International Conference on Computer Vision
  25. Shen, C.,Chen, Y.,Yang, G.(2016).On motion-sensor behavior analysis for human-activity recognition via smartphones.Proc. of IEEE International Conference on Identity, Security and Behavior Analysis
  26. Singhal, G.(2020).Introduction to LSTM Units in RNN. https://www.pluralsight.com/guides/introduction-to-lstmunits-in-rnn
  27. Sullivan, S.,Reville, B.,Taylor, A. M.(2009).Stochastic particle acceleration in the lobes of giant radio galaxies.Monthly Notices of the Royal Astronomical Society,400(1),248-257.
  28. Wang, J.,Liu, Z.,Wu, Y.,Yuan, J.(2012).Mining actionlet ensemble for action recognition with depth cameras.Proc. of IEEE Conference on Computer Vision and Pattern Recognition
  29. Winch, G.,Carr, B.(2001).Benchmarking on-site productivity in France and the UK: a CALIBRE approach.Construction Management and Economics,19,577-590.
  30. Yang, X.,Tian, Y. L.(2012).Eigenjoints-based action recognition using naïve-bayes-nearest-neighbor.IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
  31. Zhu, Y.,Chen, W.,Guo, G.(2013).Fusing spatiotemporal features and joints for 3d action recognition.Proc. of IEEE Conference on Computer Vision and Pattern Recognition Workshops
  32. 朱晏呈(2019).臺灣師範大學資訊工程學系.
  33. 李宗翰(2007)。國立台北大學資訊管理研究所。
  34. 林倪鋒(2014)。國立臺灣師範大學。
  35. 張瀞文(2016)。國立交通大學土木工程學系。
  36. 陳韋成,丁肇隆,張瑞益(2016)。營建工地安全系統之工地安全帽及背心偵測。資訊、科技與社會學報,65-77。
  37. 劉福勳(2004)。工地效率評估法在營建業之應用實例。現代營建,307,61-68。
  38. 蔡承錕(2006)。國立台灣藝術大學多媒體動畫藝術研究所。
  39. 蕭琮輝(2009)。國立台灣大學土木工程學系。
  40. 賴恒輝(2009)。朝陽科技大學營建工程系。
  41. 羅貴杰(2011)。國立高雄第一科技大學營建工程所。