Rank-GCN for Robust Action Recognition
- Authors
- Lee, Haetsal; Park, Unsang; Kim, Ig-Jae; Cho, Junghyun
- Issue Date
- 2022-08
- Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
- Citation
- IEEE Access, v.10, pp.91739-91749
- Abstract
- We present Rank-GCN, a robust skeleton-based action recognition method built on a graph convolutional network (GCN) with a new adjacency matrix. The biggest change from previous approaches is how the adjacency matrix is generated to accumulate features from neighboring nodes, by redefining "adjacency." The new matrix, which we call the rank adjacency matrix, is generated by ranking all the nodes according to metrics including the Euclidean distance from the node of interest, whereas previous GCN methods used only 1-hop neighboring nodes to construct adjacency. By adopting the rank adjacency matrix, we find not only performance improvements but also robustness against swapping, location shifting, and dropping of certain nodes. The fact that the human-designed rank adjacency matrix outperforms the deep-learning-based matrix implies that some components still benefit from human design. We expect Rank-GCN to improve performance especially when the predicted human joints are inaccurate and unstable.
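To make the idea concrete, the following is a minimal sketch of one way a rank adjacency matrix could be built from 3D joint positions. The function name, the nearest-`num_ranks` cutoff, and the binary weighting are assumptions for illustration; the paper's actual ranking metrics and weighting scheme may differ.

```python
import numpy as np

def rank_adjacency(joints, num_ranks=3):
    """Sketch of a rank-based adjacency matrix from 3D joint positions.

    For each node of interest, all nodes are ranked by Euclidean
    distance, and the nearest `num_ranks` other nodes are marked
    adjacent. This replaces the fixed 1-hop skeleton adjacency used
    by previous GCN methods.
    """
    n = joints.shape[0]
    # Pairwise Euclidean distances between all joints, shape (n, n).
    diff = joints[:, None, :] - joints[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Rank of every node as seen from each node (rank 0 = the node itself).
    ranks = np.argsort(np.argsort(dist, axis=1), axis=1)
    # Connect each node to its num_ranks nearest neighbors, excluding itself.
    adj = ((ranks >= 1) & (ranks <= num_ranks)).astype(float)
    # Symmetrize and add self-loops, as is standard for GCN adjacency.
    adj = np.maximum(adj, adj.T) + np.eye(n)
    return adj

# Example: a toy skeleton of 5 random 3D joints.
joints = np.random.rand(5, 3)
A = rank_adjacency(joints, num_ranks=2)
```

Because adjacency is recomputed from distance ranks rather than a fixed skeleton topology, the matrix adapts when joints are swapped, shifted, or dropped, which is the source of the robustness described above.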
- Keywords
- Three-dimensional displays; Spatiotemporal phenomena; Robustness; Feature extraction; Convolutional neural networks; Action recognition; graph convolutional network; dynamic convolutional network
- ISSN
- 2169-3536
- URI
- https://pubs.kist.re.kr/handle/201004/114784
- DOI
- 10.1109/ACCESS.2022.3202164
- Appears in Collections:
- KIST Article > 2022