
Computer Science > Machine Learning

Title: Theory of Graph Neural Networks: Representation and Learning

Abstract: Graph Neural Networks (GNNs), neural network architectures targeted to learning representations of graphs, have become a popular learning model for prediction tasks on nodes, graphs and configurations of points, with wide success in practice. This article summarizes a selection of the emerging theoretical results on approximation and learning properties of widely used message passing GNNs and higher-order GNNs, focusing on representation, generalization and extrapolation. Along the way, it summarizes mathematical connections.
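For reference, the message passing GNNs the abstract refers to follow the standard layer-wise update (written here in generic notation, not in this article's specific formulation): in layer \(t\), each node \(v\) aggregates the states of its neighbours \(\mathcal{N}(v)\) and combines them with its own state,

\( h_v^{(t)} = \mathrm{UPDATE}\big(h_v^{(t-1)},\ \mathrm{AGG}(\{ h_u^{(t-1)} : u \in \mathcal{N}(v)\})\big), \)

and node- or graph-level predictions are read out from the states of the final layer.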


Representation Learning and Reasoning with Graph Neural Networks


In machine learning, a system can effectively make predictions from raw data by learning representations, e.g., of objects in the world. Many types of data, such as images, language, molecules, or interactions, can be viewed as graphs. For such data, researchers are increasingly harnessing the power of Graph Neural Networks (GNNs), a structured framework for representation learning on graphs.

This project focuses on the theoretical foundations for analyzing the expressive power of GNNs, i.e., understanding and improving what these networks can learn to predict, for two uses:

  • Representation
  • Reasoning

Representations of objects in the world help us make predictions or classifications about them. Deep-learning methods such as GNNs can learn effective representations of data that can be modeled as graphs. This type of representation is used in a variety of applications. In the pharmaceutical industry, for example, GNNs can learn suitable representations of molecules to predict their properties and to help design new drugs. GNNs are also applied to obtain personalized recommendations for products or for content provided by streaming media services.

After developing representations of objects, we look for ways a GNN can improve reasoning about these representations for machine learning and artificial intelligence (AI). Given a character's representation in a game, for example, you can try to predict how that character is going to behave. Similarly, if a machine-learning system is provided with representations of objects that interact, it can learn to reason about the representations in a series of steps.

What is often key to the success of a learning method is structure in the data. The structure we consider for the reasoning process is the structure of a procedure that can answer the reasoning question. Specifically, we show that many such tasks can be solved by dynamic programming, a general algorithmic setup that allows you to efficiently solve multi-stage problems.
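As a concrete illustration (a minimal Python sketch, not code from this project), consider shortest paths computed by the Bellman–Ford dynamic program: each relaxation round updates a node's value from its neighbours' values, which is exactly the kind of neighbour-to-node update a message passing GNN performs.

```python
# Minimal sketch (illustrative only): Bellman-Ford shortest paths as a dynamic
# program. Each iteration updates a node's value from its neighbours' values,
# mirroring one round of GNN message passing.

def shortest_paths(num_nodes, edges, source):
    """edges: list of (u, v, weight); returns distance estimates from source."""
    INF = float("inf")
    dist = [INF] * num_nodes
    dist[source] = 0.0
    for _ in range(num_nodes - 1):      # at most n-1 relaxation rounds
        for u, v, w in edges:           # "messages" flow along edges
            if dist[u] + w < dist[v]:   # aggregate: min over incoming neighbours
                dist[v] = dist[u] + w   # update: overwrite node state
    return dist

# Example on a small weighted graph
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0)]
print(shortest_paths(4, edges, source=0))  # [0.0, 1.0, 3.0, 4.0]
```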

By developing theoretical foundations for reasoning about the expressive power of GNNs, and by expanding their representational capacity, we continue to develop increasingly powerful architectures for machine learning on graphs.

Communities

If you would like to contact us about our work, please refer to our members below and reach out to one of the group leads directly.



Stefanie Jegelka

Chapter 1: Representation Learning

Liang Zhao, Emory University, [email protected]; Lingfei Wu, Pinterest, [email protected]; Peng Cui, Tsinghua University, [email protected]; Jian Pei, Duke University, [email protected]

In this chapter, we first describe what representation learning is and why we need it. Among the various ways of learning representations, this chapter focuses on deep learning methods: those formed by the composition of multiple non-linear transformations, with the goal of yielding more abstract and ultimately more useful representations. We summarize representation learning techniques in different domains, focusing on the unique challenges and models for different data types, including images, natural language, speech signals and networks. Finally, we summarize the chapter and provide further reading on mutual information-based representation learning, a recently emerging technique for unsupervised representation learning.
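To make "composition of multiple non-linear transformations" concrete, here is a minimal NumPy sketch (toy dimensions and random weights are assumptions for illustration only): each layer applies a linear map followed by a non-linearity, and stacking layers yields progressively more abstract representations.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    # one non-linear transformation: linear map followed by tanh
    return np.tanh(x @ w + b)

x = rng.normal(size=(4, 16))                        # raw input features (4 examples)
w1, b1 = rng.normal(size=(16, 32)), np.zeros(32)
w2, b2 = rng.normal(size=(32, 32)), np.zeros(32)
w3, b3 = rng.normal(size=(32, 8)),  np.zeros(8)

# the learned representation is the composition h = f3(f2(f1(x)))
h = layer(layer(layer(x, w1, b1), w2, b2), w3, b3)
print(h.shape)  # (4, 8) abstract representation per example
```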

  • Representation Learning: An Introduction
  • Representation Learning in Different Areas
  • Representation Learning for Image Processing
  • Representation Learning for Speech Recognition
  • Representation Learning for Natural Language Processing
  • Representation Learning for Networks

@incollection{GNNBook-ch1-zhao,
  author    = "Zhao, Liang and Wu, Lingfei and Cui, Peng and Pei, Jian",
  editor    = "Wu, Lingfei and Cui, Peng and Pei, Jian and Zhao, Liang",
  title     = "Representation Learning",
  booktitle = "Graph Neural Networks: Foundations, Frontiers, and Applications",
  year      = "2022",
  publisher = "Springer Singapore",
  address   = "Singapore",
  pages     = "3--15",
}

L. Zhao, L. Wu, P. Cui, and J. Pei, “Representation learning,” in Graph Neural Networks: Foundations, Frontiers, and Applications, L. Wu, P. Cui, J. Pei, and L. Zhao, Eds. Singapore: Springer Singapore, 2022, pp. 3–15.


  • Perspective
  • Published: 31 October 2022

Graph representation learning in biomedicine and healthcare

  • Michelle M. Li (1, 2)
  • Kexin Huang (3)
  • Marinka Zitnik, ORCID: orcid.org/0000-0001-8530-7228 (2, 4, 5)

Nature Biomedical Engineering, volume 6, pages 1353–1369 (2022)


  • Health care
  • Machine learning
  • Molecular medicine
  • Network topology
  • Systems biology

Networks—or graphs—are universal descriptors of systems of interacting elements. In biomedicine and healthcare, they can represent, for example, molecular interactions, signalling pathways, disease co-morbidities or healthcare systems. In this Perspective, we posit that representation learning can realize principles of network medicine, discuss successes and current limitations of the use of representation learning on graphs in biomedicine and healthcare, and outline algorithmic strategies that leverage the topology of graphs to embed them into compact vectorial spaces. We argue that graph representation learning will keep pushing forward machine learning for biomedicine and healthcare applications, including the identification of genetic variants underlying complex traits, the disentanglement of single-cell behaviours and their effects on health, the assistance of patients in diagnosis and treatment, and the development of safe and effective medicines.
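As a toy illustration of embedding graph topology into a compact vector space (a hedged sketch, not one of the specific methods surveyed in this Perspective; the walk length, window and dimensionality are arbitrary choices), one can factorize random-walk co-occurrence counts in the spirit of shallow DeepWalk-style embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)   # toy interaction network

def random_walk(adj, start, length):
    walk = [start]
    for _ in range(length):
        nbrs = np.flatnonzero(adj[walk[-1]])  # neighbours of current node
        walk.append(rng.choice(nbrs))
    return walk

# Count how often pairs of nodes co-occur within a small window on walks.
cooc = np.zeros_like(adj)
for start in range(adj.shape[0]):
    for _ in range(50):
        walk = random_walk(adj, start, length=10)
        for i, u in enumerate(walk):
            for v in walk[max(0, i - 2): i + 3]:
                cooc[u, v] += 1

# A low-rank factorization gives compact 2-dimensional node embeddings.
u_mat, s, _ = np.linalg.svd(np.log1p(cooc))
embeddings = u_mat[:, :2] * np.sqrt(s[:2])
print(embeddings)
```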




Acknowledgements

We gratefully acknowledge the support of the National Science Foundation, under grants IIS-2030459 and IIS-2033384, the US Air Force Contract No. FA8702-15-D-0001, the Harvard Data Science Initiative, and awards from Amazon Research, Bayer Early Excellence in Science, AstraZeneca Research, and Roche Alliance with Distinguished Scientists. M.M.L. is supported by T32HG002295 from the National Human Genome Research Institute and a National Science Foundation Graduate Research Fellowship. Any opinions, findings, conclusions or recommendations expressed in this article are those of the authors and do not necessarily reflect the views of the funders.

Author information

Authors and Affiliations

Bioinformatics and Integrative Genomics Program, Harvard Medical School, Boston, MA, USA

Michelle M. Li

Department of Biomedical Informatics, Harvard Medical School, Boston, MA, USA

Michelle M. Li & Marinka Zitnik

Health Data Science Program, Harvard T.H. Chan School of Public Health, Boston, MA, USA

Kexin Huang

Broad Institute of MIT and Harvard, Cambridge, MA, USA

Marinka Zitnik

Harvard Data Science Initiative, Cambridge, MA, USA

Marinka Zitnik

Contributions

M.M.L. and M.Z. conceived the work and shaped its framing. M.M.L. performed background research and wrote the manuscript together with K.H. and M.Z. All authors discussed the content, and reviewed and edited the manuscript.

Corresponding author

Correspondence to Marinka Zitnik .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Peer review

Peer review information.

Nature Biomedical Engineering thanks Feixiong Cheng, Fabian Theis and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary information.

Supplementary notes, figures and references.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Cite this article.

Li, M.M., Huang, K. & Zitnik, M. Graph representation learning in biomedicine and healthcare. Nat. Biomed. Eng 6 , 1353–1369 (2022). https://doi.org/10.1038/s41551-022-00942-x

Download citation

Received : 30 October 2021

Accepted : 09 August 2022

Published : 31 October 2022

Issue Date : December 2022

DOI : https://doi.org/10.1038/s41551-022-00942-x


This article is cited by

Mapping cell-to-tissue graphs across human placenta histology whole slide images using deep learning with HAPPY

  • Claudia Vanea
  • Jelisaveta Džigurski
  • Christoffer Nellåker

Nature Communications (2024)

Causal diagramming for assessing human system risk in spaceflight

  • Erik Antonsen
  • Robert J. Reynolds
  • Daniel M. Buckland

npj Microgravity (2024)

A software resource for large graph processing and analysis

Nature Computational Science (2023)

Building a knowledge graph to enable precision medicine

  • Payal Chandak

Scientific Data (2023)

Speos: an ensemble graph representation learning framework to predict core gene candidates for complex diseases

  • Florin Ratajczak
  • Mitchell Joblin
  • Matthias Heinig

Nature Communications (2023)



Enhanced Graph Representations for Graph Convolutional Network Models

  • 1222: Intelligent Multimedia Data Analytics and Computing
  • Published: 02 February 2022
  • Volume 82, pages 9649–9666 (2023)


  • Vandana Bhattacharjee, ORCID: orcid.org/0000-0002-0680-2691
  • Raj Sahu
  • Amit Dutta


Graph Convolutional Networks (GCNs) are increasingly popular among researchers for their ability to solve node, graph and link classification tasks. Graphs are a useful representation for many application domains, and methods are being proposed to extract from them meaningful information in a form that machine learning tasks can use; graph convolutional networks are one such method, propagating and transforming node feature information. Following the message passing strategy, a graph neural network learns a node's embedding by aggregating the representations of its neighbours and of the node itself. In this work we incorporate the concept of overlap into the graph data, thereby capturing structural similarities in the node features. The intuition behind this proposal is that the class or label of a document represented by node \({v}_{i}\) is influenced by its own node features and by the node features of its neighbourhood. We therefore propose to enhance the graph representation to capture this neighbourhood; the enhanced graph is then input to a graph convolutional network model for the classification task. These measures improve the accuracy of node classification. Experiments on a number of datasets with different similarity measures demonstrate that enhancing graph representations produces better results in terms of classification accuracy.
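A minimal sketch of the idea described above (hedged assumptions: cosine similarity as the overlap measure, a fixed threshold, and a single propagation layer; the paper's actual similarity measures and settings may differ): add feature-similarity edges to the adjacency matrix, then aggregate each node's own and neighbours' features in one GCN-style step.

```python
import numpy as np

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)    # original graph edges
feats = rng.random((4, 8))                     # toy node feature vectors

# Enhance the graph: add an edge wherever two nodes' features overlap strongly
# (here measured by cosine similarity above an illustrative threshold).
norm = feats / np.linalg.norm(feats, axis=1, keepdims=True)
sim = norm @ norm.T
enhanced = np.maximum(adj, (sim > 0.5).astype(float))
np.fill_diagonal(enhanced, 0.0)

# One GCN propagation step: A_hat = D^(-1/2) (A + I) D^(-1/2), H = relu(A_hat X W).
a_tilde = enhanced + np.eye(4)                 # self-loops keep each node's own features
d_inv_sqrt = np.diag(1.0 / np.sqrt(a_tilde.sum(axis=1)))
a_hat = d_inv_sqrt @ a_tilde @ d_inv_sqrt
weights = rng.normal(size=(8, 4))
hidden = np.maximum(a_hat @ feats @ weights, 0.0)
print(hidden.shape)  # (4, 4) node embeddings passed on to the classifier
```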



Data availability.

Public datasets have been used.

Code availability

Code can be made available on request.


Acknowledgements

The authors gratefully acknowledge the valuable suggestions provided by the anonymous reviewers, which greatly helped in preparing the paper in its present form.

The research has been supported by Birla Institute of Technology, Mesra, Ranchi.

Author information

Authors and affiliations

Birla Institute of Technology, Mesra, Ranchi, India

Vandana Bhattacharjee, Raj Sahu & Amit Dutta


Contributions

The first author is responsible for conceptualization, problem definition, solution design and report writing. The second and third authors are responsible for implementation and for supporting the report writing.

Corresponding author

Correspondence to Vandana Bhattacharjee.

Ethics declarations

Conflicts of interest/competing interests, ethics approval

Not applicable.

Consent to participate

All authors are aware of this submission and have reviewed and consented to it.

Consent for publication

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Figs. 5, 6, and 7 present the accuracy and loss curves for all the datasets.

Figure 5: Accuracy and loss curves of the GCN model for the Cora dataset (Sim1)

Figure 6: Accuracy and loss curves of the GCN model for the Citeseer dataset (Sim1)

Figure 7: Accuracy and loss curves of the GCN model for the Pubmed dataset (Sim1)


About this article

Bhattacharjee, V., Sahu, R. & Dutta, A. Enhanced Graph Representations for Graph Convolutional Network Models. Multimed Tools Appl 82, 9649–9666 (2023). https://doi.org/10.1007/s11042-021-11843-7


Received: 16 March 2021

Revised: 05 August 2021

Accepted: 23 December 2021

Published: 02 February 2022

Issue Date: March 2023

DOI: https://doi.org/10.1007/s11042-021-11843-7


  • Graph convolutional networks
  • Citation graphs
  • Overlap measures

IMAGES

  1. What Are Graph Neural Networks? How GNNs Work, Explained with Examples

  2. Graph Structure Learning for Robust Graph Neural Networks

  3. Graph Neural Networks with Adaptive Residual

  4. Introduction to Neural Networks with Scikit-Learn

  5. Graph Neural Networks (GNN) using Pytorch Geometric

  6. Neural Network: A Complete Beginners Guide

VIDEO

  1. Intro to graph neural networks (ML Tech Talks)

  2. An Introduction to Graph Neural Networks: Models and Applications

  3. Understanding Graph Neural Networks

  4. Graph Neural Networks

  5. Stanford CS224W: ML with Graphs

  6. Stanford CS224W: Machine Learning with Graphs

COMMENTS

  1. Theory of Graph Neural Networks: Representation and Learning

    Graph Neural Networks (GNNs), neural network architectures targeted to learning representations of graphs, have become a popular learning model for prediction tasks on nodes, graphs and configurations of points, with wide success in practice. This article summarizes a selection of the emerging theoretical results on approximation and learning properties of widely used message passing GNNs and ...

  2. Graph Representation Learning Book

    The field of graph representation learning has grown at an incredible (and sometimes unwieldy) pace over the past seven years, transforming from a small subset of researchers working on a relatively niche topic to one of the fastest growing sub-areas of deep learning. ... Chapter 6: Graph Neural Networks in Practice [Draft. Updated September ...

  3. PDF Graph Representation Learning

    on graph representation learning, including techniques for deep graph embeddings, generalizations of convolutional neural networks to graph-structured data, and neural message-passing approaches inspired by belief propagation. These advances in graph representation learning have led to new state-of-the-art results in numerous domains,

  4. A Comprehensive Introduction to Graph Neural Networks (GNNs)

    Graph Neural Networks are special types of neural networks capable of working with a graph data structure. They are highly influenced by Convolutional Neural Networks (CNNs) and graph embedding. GNNs are used in predicting nodes, edges, and graph-based tasks. CNNs are used for image classification.

  5. A Comprehensive Survey on Deep Graph Representation Learning

    Inspired by the recent remarkable success of deep neural networks, a range of deep learning algorithms has been developed for graph-structured data learning. The core of these methods is to generate effective node and graph representations using graph neural networks (GNNs), followed by a goal-oriented learning paradigm.

  6. Graph Representation Learning

    This book is a foundational guide to graph representation learning, including state-of-the art advances, and introduces the highly successful graph neural network (GNN) formalism. Graph-structured data is ubiquitous throughout the natural and social sciences, from telecommunication networks to quantum chemistry. Building relational inductive ...

  7. Graph Representation Learning

    Recent years have seen a surge in research on graph representation learning, including techniques for deep graph embeddings, generalizations of convolutional neural networks to graph-structured data, and neural message-passing approaches inspired by belief propagation. These advances in graph representation learning have led to new state-of-the ...

  8. Graph neural network

    A graph neural network (GNN) belongs to a class of artificial neural networks for processing data that can be represented as graphs. Its basic building blocks are a permutation-equivariant layer, a local pooling layer, and a global pooling (or readout) layer (a minimal sketch of these three blocks appears after this list). In the more general subject of "geometric deep learning", certain existing neural ...

  9. Graph Representation Learning

    Graph representation learning, also known as network embedding, has been extensively studied in AI and data mining. In this chapter, we introduce a variety of graph representation learning methods that embed graph data into vectors with shallow or deep neural models.

  10. Graph Representation Learning

    A unified view on graph neural networks as graph signal denoising. arXiv (2020). Yao Ma, Suhang Wang, Charu C Aggarwal, and Jiliang Tang. 2019. Graph convolutional networks with eigenpooling. In KDD. Bryan Perozzi, Rami Al-Rfou, and Steven Skiena. 2014. Deepwalk: Online learning of social representations. In KDD '14.

  11. Representation Learning and Reasoning with Graph Neural Networks

    Graph Neural Networks (GNNs) are a powerful framework revolutionizing graph representation learning, but our understanding of their representational properties is limited. This project aims to explore the theoretical foundations of learning with graphs and relations in AI via the GNN architecture. In machine learning, a system can effectively ...

  12. GNNBook@2023: Representation Learning

    We summarize the representation learning techniques in different domains, focusing on the unique challenges and models for different data types including images, natural languages, speech signals and networks. At last, we summarize this chapter and provide further reading on mutual information-based representation learning, which is a recently ...

  13. Graph Transformer: Learning Better Representations for Graph Neural

    Graphs are widely used to model complex objects and their dependency relationships in many pattern recognition and machine learning tasks []. Along with the recent success of deep learning networks, growing interest has focused on utilizing these methods for analyzing large-scale and high-dimensional regular or Euclidean data []. In particular, Convolutional Neural Networks (CNNs) [] have become ...

  14. Graph representation learning in biomedicine and healthcare

    Some aspects of graph representation learning have been covered extensively in the literature: deep learning on structured data 17,18; graph neural networks 19,20,21 (GNNs); representation ...

  15. GAN‐based deep neural networks for graph representation learning

    In this article, we propose a GAN-based deep neural network (DnnGAN) for graph representation learning. We adopt a structural deep autoencoder as a discriminator, which tries to restructure the neighbor relationship of the vertices and accurately predicts whether the input vertex pair is positive according to the middle layer representation.

  16. DPGNN: Dual-perception graph neural network for representation learning

    Graph neural networks (GNNs) have drawn increasing attention in recent years and achieved remarkable performance in many graph-based tasks, especially in semi-supervised learning on graphs. ... Case (2) shows that multi-space message interaction is complementary to graph representation learning, and our proposed message-passing paradigm can ...

  17. Dynamic Representation Learning via Recurrent Graph Neural Networks

    To alleviate this problem, this article proposes a representation learning model for dynamic graphs, called DynGNN. Differently, it is a single-stage model that embeds an RNN into a graph neural network to produce better representations in a compact form. This takes the fusion of temporal and topology correlations into account from low-level to ...

  18. FL-GNNs: Robust Network Representation via Feature Learning Guided

    FL-GNNs: Robust Network Representation via Feature Learning Guided Graph Neural Networks Abstract: Graph Neural Networks (GNNs) ... In this article, we propose a novel Feature Learning guided Graph Neural Networks (FL-GNNs) by incorporating robust feature learning into GNNs.

  19. Graph representation learning via simple jumping knowledge networks

    Recent graph neural networks for graph representation learning depend on a neighborhood aggregation process. Several works focus on simplifying the neighborhood aggregation process and model structures. However, as the depth of the models increases, the simplified models will encounter oversmoothing, resulting in a decrease in model performance. Several works leverage sophisticated learnable ...

  20. Scalable graph representation learning with Graph Neural Networks

    From thousands to billions: An overview of methods for scaling Graph Neural Networks. Representation learning, also known as feature learning, is a machine learning specialisation focused on ...

  21. Enhanced Graph Representations for Graph Convolutional Network Models

    Graph Convolutional Network (GCN) is increasingly becoming popular among researchers for its capability of solving the task of classification of nodes, graphs or links. Graphs being a very useful representation for several application domains are increasingly grabbing the attention of researchers. Methods are being proposed to extract meaningful information in a form which can be used by ...
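Item 8 above describes the three standard GNN building blocks: a permutation-equivariant message passing layer, a local pooling layer, and a global pooling (readout) layer. As a rough illustration of how they fit together, here is a minimal NumPy sketch; the sum aggregation, the mean readout, and the hand-supplied cluster assignment are assumptions made for the example and are not taken from any of the articles listed above.

```python
import numpy as np

def message_passing(A, H, W):
    """Permutation-equivariant layer: each node sums its neighbours' (and its own)
    features and mixes them with a shared weight matrix."""
    return np.maximum((A + np.eye(A.shape[0])) @ H @ W, 0.0)

def local_pool(A, H, clusters):
    """Local (cluster) pooling: average node features within each cluster and
    coarsen the adjacency. `clusters` maps node index -> cluster id (assumed given)."""
    n, k = A.shape[0], clusters.max() + 1
    S = np.zeros((n, k))
    S[np.arange(n), clusters] = 1.0            # hard assignment matrix
    sizes = np.maximum(S.sum(0), 1.0)
    H_pooled = (S.T @ H) / sizes[:, None]      # mean features per cluster
    A_pooled = (S.T @ A @ S > 0).astype(float) # connect clusters that share an edge
    np.fill_diagonal(A_pooled, 0.0)
    return A_pooled, H_pooled

def global_readout(H):
    """Global pooling (readout): one permutation-invariant vector for the whole graph."""
    return H.mean(axis=0)

# Toy run: 6 nodes, 4 features, two clusters.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
H = rng.normal(size=(6, 4))
W = rng.normal(size=(4, 4))
clusters = np.array([0, 0, 0, 1, 1, 1])

H1 = message_passing(A, H, W)
A2, H2 = local_pool(A, H1, clusters)
graph_vec = global_readout(H2)   # fixed-size representation of the whole graph
```

The readout vector is what a graph-level classifier would consume, while the pre-pooling node embeddings serve node-level tasks such as the document classification discussed earlier in this article.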