Exploiting node-feature bipartite graph in graph convolutional networks

Yuli Jiang, Huaijia Lin, Ye Li, Yu Rong, Hong Cheng*, Xin Huang

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

7 Citations (Scopus)


In recent years, Graph Convolutional Networks (GCNs), which extend convolutional neural networks to graph-structured data, have achieved great success on many graph learning tasks, such as node classification, by fusing structure and feature information. However, the graph structure is constructed from real-world data and usually contains noise or redundancy. In addition, this structural information is based on manually defined relations and may not be optimal for downstream tasks. In this paper, we utilize the knowledge from node features to enhance the expressive power of GCN models in a plug-and-play fashion. Specifically, we build a node-feature bipartite graph and exploit a bipartite graph convolutional network to model node-feature relations. By aligning predictions from the original graph structure and the node-feature relations, we can make a more accurate prediction for each node in an end-to-end manner. Extensive experiments demonstrate that the proposed model can extract knowledge from both branches and improve the performance of various GCN models on typical graph data sets and 3D point cloud data.
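The core construction described in the abstract — treating the node-feature matrix as a bipartite graph and convolving over it — can be sketched as below. This is a minimal illustrative sketch, not the paper's actual architecture: the function names, the row-normalized aggregation, and the single ReLU layer are all assumptions standing in for the model's bipartite graph convolution.

```python
import numpy as np

# Hypothetical sketch: the n x d feature matrix X is read as the biadjacency
# matrix of a node-feature bipartite graph (node i is linked to feature j
# when X[i, j] != 0), and one round of feature -> node message passing is
# performed in GCN style. All names here are illustrative assumptions.

def normalize_rows(M, eps=1e-8):
    # Row-normalize so each node averages over its incident features;
    # eps guards against nodes with no active features.
    return M / (M.sum(axis=1, keepdims=True) + eps)

def bipartite_gcn_layer(X, H_feat, W):
    # Nodes aggregate feature-side embeddings H_feat through the bipartite
    # edges, followed by a linear transform and ReLU (a standard GCN-style
    # layer applied to the bipartite graph).
    H_node = normalize_rows(X) @ H_feat  # feature -> node aggregation
    return np.maximum(H_node @ W, 0.0)

rng = np.random.default_rng(0)
X = (rng.random((5, 8)) > 0.5).astype(float)  # 5 nodes, 8 binary features
H_feat = rng.standard_normal((8, 4))          # initial feature embeddings
W = rng.standard_normal((4, 4))               # layer weights

H = bipartite_gcn_layer(X, H_feat, W)
print(H.shape)  # node embeddings from the feature branch: (5, 4)
```

In the paper's plug-and-play setting, embeddings like `H` from this feature branch would be aligned with the predictions of an ordinary GCN running on the original graph structure, and both branches trained end-to-end.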

Original language: English
Pages (from-to): 409-423
Number of pages: 15
Journal: Information Sciences
Publication status: Published - May 2023

Scopus Subject Areas

  • Theoretical Computer Science
  • Software
  • Control and Systems Engineering
  • Computer Science Applications
  • Information Systems and Management
  • Artificial Intelligence

User-Defined Keywords

  • Bipartite graph
  • Bipartite graph convolutional networks
  • Graph convolutional networks
  • Node classification
  • Semi-supervised learning

