Searching 3D Motion Patterns of Vietnamese Traditional Dances

Tìm kiếm mẫu chuyển động 3D của múa truyền thống Việt Nam

  • Thi Thanh Van Le
  • Vu Ngoc Quang
  • Pham Thanh Huyen
  • Ma Thi Chau
  • Le Thanh Ha
Keywords: Motion search, motion recognition, similarity matching, dynamic time warping


Vietnam has many traditional performing arts, such as Xoan singing, "tuồng", and "chèo", whose dances urgently need to be preserved in digital formats, especially 3D motion capture. Digital formats bring many benefits, such as the ability to automatically classify and search the content of dance movements. In this paper, we propose a system for 3D movement search over the postures and gestures of Chèo dance. The system applies a sliding-window technique, the Dynamic Time Warping algorithm, and a novel feature selection method named CheoAngle. Results show that the proposed system reaches good scores on several metrics. We also compare CheoAngle with other feature selection methods for 3D movement and show that CheoAngle gives the best results.
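To illustrate the matching step described in the abstract, the sketch below combines a sliding window with classic Dynamic Time Warping over 1-D feature sequences. This is a minimal illustration, not the authors' implementation: the CheoAngle feature extraction is abstracted away, and each frame is assumed to be already reduced to a single numeric feature (e.g., a joint angle).

```python
# Minimal sketch: sliding-window motion search with Dynamic Time Warping (DTW).
# Assumes motions are pre-processed into 1-D numeric feature sequences
# (one value per frame); this is NOT the paper's actual code.

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW distance between two numeric sequences."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

def sliding_window_search(sequence, query, step=1):
    """Slide a query-length window over the sequence.

    Returns (start_index, distance) of the best-matching window.
    """
    w = len(query)
    best = (None, float("inf"))
    for start in range(0, len(sequence) - w + 1, step):
        dist = dtw_distance(sequence[start:start + w], query)
        if dist < best[1]:
            best = (start, dist)
    return best

# Toy usage: locate a short query pattern inside a longer motion sequence.
motion = [0, 0, 1, 3, 5, 3, 1, 0, 0, 0]
query = [1, 3, 5, 3, 1]
start, dist = sliding_window_search(motion, query)
print(start, dist)  # best window starts at index 2 with distance 0.0
```

In practice, a multi-dimensional feature vector per frame would replace the scalar values, with the per-frame cost becoming a vector distance, but the window-plus-DTW control flow stays the same.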

Author Biographies

Thi Thanh Van Le
Human Machine Interaction Laboratory
VNU University of Engineering and Technology
Hanoi, Vietnam

Vu Ngoc Quang
Human Machine Interaction Laboratory
VNU University of Engineering and Technology
Hanoi, Vietnam

Pham Thanh Huyen
Faculty of Information Technology
Ha Long University
Quang Ninh, Vietnam

Ma Thi Chau
Human Machine Interaction Laboratory
VNU University of Engineering and Technology
Hanoi, Vietnam

Le Thanh Ha
Human Machine Interaction Laboratory
VNU University of Engineering and Technology
Hanoi, Vietnam

