Opportunities and challenges of graph neural networks in electrical engineering

Tanenbaum, A. S. Computer Networks (Pearson Education India, 2003).
Shannon, C. E. Claude Elwood Shannon: Collected Papers (IEEE, 1993).
Akpakwu, G. A., Silva, B. J., Hancke, G. P. & Abu-Mahfouz, A. M. A survey on 5G networks for the Internet of Things: communication technologies and challenges. IEEE Access 6, 3619–3647 (2017).
Silver, D. et al. Mastering the game of Go without human knowledge. Nature 550, 354–359 (2017).
Devlin, J., Chang, M.-W., Lee, K. & Toutanova, K. BERT: pre-training of deep bidirectional transformers for language understanding. In Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (eds Burstein, J. et al.) 4171–4186 (ACL, 2019).
OpenAI et al. GPT-4 technical report. Preprint at arXiv https://doi.org/10.48550/arXiv.2303.08774 (2023).
Gori, M., Monfardini, G. & Scarselli, F. A new model for learning in graph domains. In Proc. 2005 IEEE International Joint Conference on Neural Networks Vol. 2, 729–734 (IEEE, 2005).
Scarselli, F., Gori, M., Tsoi, A. C., Hagenbuchner, M. & Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. 20, 61–80 (2008).
Kipf, T. N. & Welling, M. Variational graph autoencoders. In NIPS Workshop on Bayesian Deep Learning (NIPS, 2016).
Shen, Y., Shi, Y., Zhang, J. & Letaief, K. B. A graph neural network approach for scalable wireless power control. In 2019 IEEE Globecom Workshops 1–6 (IEEE, 2019).
Chowdhury, A., Verma, G., Rao, C., Swami, A. & Segarra, S. ML-aided power allocation for tactical MIMO. In 2021 IEEE Military Communications Conference 273–278 (IEEE, 2021).
Chowdhury, A., Verma, G., Rao, C., Swami, A. & Segarra, S. Unfolding WMMSE using graph neural networks for efficient power allocation. IEEE Trans. Wirel. Commun. 20, 6004–6017 (2021).
Li, B., Verma, G. & Segarra, S. Graph-based algorithm unfolding for energy-aware power allocation in wireless networks. IEEE Trans. Wirel. Commun. 22, 1359–1373 (2022). This paper discusses the use of the algorithm unrolling framework to address the power allocation problem in the application of GNNs in wireless networks.
Wang, Z., Eisen, M. & Ribeiro, A. Learning decentralized wireless resource allocations with graph neural networks. IEEE Trans. Signal. Process. 70, 1850–1863 (2022).
Shen, Y., Zhang, J., Song, S. H. & Letaief, K. B. Graph neural networks for wireless communications: from theory to practice. IEEE Trans. Wirel. Commun. 22, 3554–3569 (2023).
Owerko, D., Gama, F. & Ribeiro, A. Optimal power flow using graph neural networks. In 2020 IEEE International Conference on Acoustics, Speech and Signal Processing 5930–5934 (IEEE, 2020).
Owerko, D., Gama, F. & Ribeiro, A. Predicting power outages using graph neural networks. In IEEE Global Conference on Signal and Information Processing 743–747 (IEEE, 2018).
Donon, B. et al. Neural networks for power flow: graph neural solver. Electr. Power Syst. Res. 189, 106547 (2020).
Ustun, E., Deng, C., Pal, D., Li, Z. & Zhang, Z. Accurate operation delay prediction for FPGA HLS using graph neural networks. In 2020 IEEE/ACM International Conference On Computer Aided Design (ICCAD) 1–9 (IEEE, 2020).
Xie, Z. et al. Preplacement net length and timing estimation by customized graph neural network. IEEE Trans. Comput. Aided Des. Integr. Circuits Syst. 41, 4667–4680 (2022).
Liu, M. et al. Parasitic-aware analog circuit sizing with graph neural networks and Bayesian optimization. In 2021 Design, Automation & Test in Europe Conference & Exhibition (DATE) 1372–1377 (IEEE, 2021).
Guo, Z. et al. A timing engine inspired graph neural network model for pre-routing slack prediction. In Proc. 59th ACM/IEEE Design Automation Conference 1207–1212 (ACM, 2022).
Yang, Z. et al. Versatile multi-stage graph neural network for circuit representation. In Proc. 36th International Conference on Neural Information Processing Systems 20313–20324 (Curran Associates Inc., 2022).
Shlomi, J., Battaglia, P. & Vlimant, J.-R. Graph neural networks in particle physics. Mach. Learn. Sci. Technol. 2, 021001 (2020).
Duarte, J. & Vlimant, J.-R. Graph neural networks for particle tracking and reconstruction. In Artificial Intelligence for High Energy Physics 387 (World Scientific, 2022).
DeZoort, G., Battaglia, P. W., Biscarat, C. & Vlimant, J.-R. Graph neural networks at the Large Hadron Collider. Nat. Rev. Phys. 5, 281 (2023).
Fung, V., Zhang, J., Juarez, E. & Sumpter, B. G. Benchmarking graph neural networks for materials chemistry. npj Comput. Mater. 7, 84 (2021).
Reiser, P. et al. Graph neural networks for materials science and chemistry. Commun. Mater. 3, 93 (2022).
Baek, M. et al. Accurate prediction of protein structures and interactions using a three-track neural network. Science 373, 871–876 (2021).
Dauparas, J. et al. Robust deep learning–based protein sequence design using ProteinMPNN. Science 378, 49–56 (2022).
Stokes, J. M. et al. A deep learning approach to antibiotic discovery. Cell 180, 688–702 (2020).
Hamilton, W., Ying, Z. & Leskovec, J. Inductive representation learning on large graphs. In Proc. 31st International Conference on Neural Information Processing Systems 1025–1035 (Curran Associates Inc., 2017).
Veličković, P. et al. Graph attention networks. In International Conference on Learning Representations (ICLR, 2018).
Battaglia, P. W. et al. Relational inductive biases, deep learning, and graph networks. Preprint at arXiv https://doi.org/10.48550/arXiv.1806.01261 (2018). This fundamental work presents the architecture of GNNs, showing how it can be represented and implemented in a message passing process.
Defferrard, M., Bresson, X. & Vandergheynst, P. Convolutional neural networks on graphs with fast localized spectral filtering. In Proc. 30th International Conference on Neural Information Processing Systems 3844–3852 (Curran Associates Inc., 2016).
Bronstein, M. M., Bruna, J., LeCun, Y., Szlam, A. & Vandergheynst, P. Geometric deep learning: going beyond Euclidean data. IEEE Signal. Process. Mag. 34, 18–42 (2017).
Chien, E., Peng, J., Li, P. & Milenkovic, O. Adaptive universal generalized PageRank graph neural network. In International Conference on Learning Representations (ICLR, 2021).
Wang, X. & Zhang, M. How powerful are spectral graph neural networks. In Proc. 39th International Conference on Machine Learning 23341–23362 (ICML, 2022).
Corso, G., Cavalleri, L., Beaini, D., Liò, P. & Veličković, P. Principal neighbourhood aggregation for graph nets. In Proc. 34th International Conference on Neural Information Processing Systems 13260–13271 (Curran Associates Inc., 2020).
Hornik, K., Stinchcombe, M. & White, H. Multilayer feedforward networks are universal approximators. Neural Netw. 2, 359–366 (1989).
Xu, K., Hu, W., Leskovec, J. & Jegelka, S. How powerful are graph neural networks? In International Conference on Learning Representations (ICLR, 2019).
Morris, C. et al. Weisfeiler and Leman go neural: higher-order graph neural networks. In Proc. 33rd AAAI Conference on Artificial Intelligence 4602–4609 (AAAI, 2019).
Maron, H., Ben-Hamu, H., Shamir, N. & Lipman, Y. Invariant and equivariant graph networks. In International Conference on Learning Representations (ICLR, 2019).
Li, P., Wang, Y., Wang, H. & Leskovec, J. Distance encoding: design provably more powerful neural networks for graph representation learning. In Proc. 34th International Conference on Neural Information Processing Systems 4465–4478 (Curran Associates Inc., 2020).
Bouritsas, G., Frasca, F., Zafeiriou, S. & Bronstein, M. M. Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Trans. Pattern Anal. Mach. Intell. 45, 657–668 (2022).
Kipf, T. N. & Welling, M. Semi-supervised classification with graph convolutional networks. In International Conference on Learning Representations (ICLR, 2017).
Chen, D. et al. Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In Proc. AAAI Conference on Artificial Intelligence 3438–3445 (AAAI, 2020).
Topping, J., Di Giovanni, F., Chamberlain, B. P., Dong, X. & Bronstein, M. M. Understanding over-squashing and bottlenecks on graphs via curvature. In International Conference on Learning Representations (ICLR, 2022).
Alon, U. & Yahav, E. On the bottleneck of graph neural networks and its practical implications. In International Conference on Learning Representations (ICLR, 2021).
Chen, K., Hu, J., Zhang, Y., Yu, Z. & He, J. Fault location in power distribution systems via deep graph convolutional networks. IEEE J. Sel. Areas Commun. 38, 119–131 (2019).
de Freitas, J. T. & Coelho, F. G. F. Fault localization method for power distribution systems based on gated graph neural networks. Electr. Eng. 103, 2259–2266 (2021).
Arjona Martínez, J., Cerri, O., Pierini, M., Spiropulu, M. & Vlimant, J.-R. Pileup mitigation at the Large Hadron Collider with graph neural networks. Eur. Phys. J. Plus 134, 333 (2019).
Li, T. et al. Semi-supervised graph neural networks for pileup noise removal. Eur. Phys. J. C 83, 99 (2023).
Luo, Y. et al. A network integration approach for drug–target interaction prediction and computational drug repositioning from heterogeneous information. Nat. Commun. 8, 573 (2017).
Yu, Z., Huang, F., Zhao, X., Xiao, W. & Zhang, W. Predicting drug–disease associations through layer attention graph convolutional network. Brief. Bioinform. 22, bbaa243 (2021).
Farrell, S. et al. Novel deep learning methods for track reconstruction. In International Workshop Connecting The Dots (2018).
Ju, X. et al. Performance of a geometric deep learning pipeline for HL-LHC particle tracking. Eur. Phys. J. C 81, 876 (2021).
DeZoort, G. et al. Charged particle tracking via edge-classifying interaction networks. Comput. Softw. Big Sci. 5, 26 (2021).
Wu, N., Yang, H., Xie, Y., Li, P. & Hao, C. High-level synthesis performance prediction using GNNs: benchmarking, modeling, and advancing. In Proc. 59th ACM/IEEE Design Automation Conference 49–54 (ACM, 2022).
Schütt, K. et al. SchNet: a continuous-filter convolutional neural network for modeling quantum interactions. In Proc. 31st International Conference on Neural Information Processing Systems 992–1002 (Curran Associates Inc., 2017).
Wu, Z. et al. MoleculeNet: a benchmark for molecular machine learning. Chem. Sci. 9, 513–530 (2018).
Xie, T. & Grossman, J. C. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Phys. Rev. Lett. 120, 145301 (2018).
Qu, H. & Gouskos, L. ParticleNet: jet tagging via particle clouds. Phys. Rev. D 101, 056019 (2020).
Guo, J., Li, J., Li, T. & Zhang, R. Boosted Higgs boson jet reconstruction via a graph neural network. Phys. Rev. D 103, 116025 (2021).
Eisen, M. & Ribeiro, A. Optimal wireless resource allocation with random edge graph neural networks. IEEE Trans. Signal. Process. 68, 2977–2991 (2020).
Owerko, D., Gama, F. & Ribeiro, A. Unsupervised optimal power flow using graph neural networks. In 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 6885–6889 (IEEE, 2024).
Nachmani, E. & Wolf, L. Hyper-graph-network decoders for block codes. In Proc. 33rd International Conference on Neural Information Processing Systems 2329–2339 (Curran Associates Inc., 2019).
Cammerer, S., Hoydis, J., Aoudia, F. A. & Keller, A. Graph neural networks for channel decoding. In 2022 IEEE Globecom Workshops 486–491 (IEEE, 2022).
Chen, T. et al. Learning to optimize: a primer and a benchmark. J. Mach. Learn. Res. 23, 8562–8620 (2022).
Monga, V., Li, Y. & Eldar, Y. C. Algorithm unrolling: interpretable, efficient deep learning for signal and image processing. IEEE Signal. Process. Mag. 38, 18–44 (2021). This review covers the integration of deep learning tools with conventional optimization algorithm frameworks to enhance the resolution of signal and image processing tasks through data-driven approaches.
Zhao, Z., Verma, G., Rao, C., Swami, A. & Segarra, S. Link scheduling using graph neural networks. IEEE Trans. Wirel. Commun. 22, 3997–4012 (2022).
Zhao, Z., Verma, G., Swami, A. & Segarra, S. Delay-oriented distributed scheduling using graph neural networks. In 2022 IEEE International Conference on Acoustics, Speech and Signal Processing 8902–8906 (IEEE, 2022).
Zhao, Z., Verma, G., Rao, C., Swami, A. & Segarra, S. Distributed scheduling using graph neural networks. In 2021 IEEE International Conference on Acoustics, Speech and Signal Processing 4720–4724 (IEEE, 2021).
Kahng, A. B., Lienig, J., Markov, I. L. & Hu, J. VLSI Physical Design: From Graph Partitioning to Timing Closure 312 (Springer, 2011).
Callister Jr, W. D. & Rethwisch, D. G. Fundamentals of Materials Science and Engineering: An Integrated Approach (Wiley, 2020).
Erdős, P. & Rényi, A. On random graphs I. Publ. Math. Debr. 6, 290–297 (1959).
Makhzani, A., Shlens, J., Jaitly, N., Goodfellow, I. & Frey, B. Adversarial autoencoders. Preprint at arXiv https://doi.org/10.48550/arXiv.1511.05644 (2015).
Xu, M. et al. GeoDiff: a geometric diffusion model for molecular conformation generation. In International Conference on Learning Representations (ICLR, 2022).
Vignac, C. et al. DiGress: discrete denoising diffusion for graph generation. In International Conference on Learning Representations (ICLR, 2023).
Mercado, R. et al. Graph networks for molecular design. Mach. Learn. Sci. Technol. 2, 025023 (2021).
Bilodeau, C., Jin, W., Jaakkola, T., Barzilay, R. & Jensen, K. F. Generative models for molecular discovery: recent advances and challenges. Wiley Interdiscip. Rev. Comput. Mol. Sci. 12, e1608 (2022).
Jin, W., Barzilay, R. & Jaakkola, T. Junction tree variational autoencoder for molecular graph generation. In Proc. 35th International Conference on Machine Learning 2323–2332 (ICML, 2018).
Mirhoseini, A. et al. A graph placement methodology for fast chip design. Nature 594, 207–212 (2021). This groundbreaking work discusses the application of GNNs and RL to EDA, solving the global placement problem in chip design and outperforming the state-of-the-art method for this task.
Cheng, R. et al. The policy-gradient placement and generative routing neural networks for chip design. In Proc. 36th International Conference on Neural Information Processing Systems 26350–26362 (Curran Associates Inc., 2022).
Chen, T., Zhang, G. L., Yu, B., Li, B. & Schlichtmann, U. Machine learning in advanced IC design: a methodological survey. IEEE Des. Test 40, 17–33 (2022).
Sánchez, D., Servadei, L., Kiprit, G. N., Wille, R. & Ecker, W. A comprehensive survey on electronic design automation and graph neural networks: theory and applications. ACM Trans. Des. Autom. Electron. Syst. 28, 1–27 (2023).
Zhang, J. et al. Fine-grained service offloading in B5G/6G collaborative edge computing based on graph neural networks. In IEEE International Conference on Communications 5226–5231 (IEEE, 2022).
Ma, Y., He, Z., Li, W., Zhang, L. & Yu, B. Understanding graphs in EDA: from shallow to deep learning. In Proc. 2020 International Symposium on Physical Design 119–126 (ACM, 2020).
Agnesina, A., Chang, K. & Lim, S. K. VLSI placement parameter optimization using deep reinforcement learning. In 2020 IEEE/ACM International Conference On Computer Aided Design (ICCAD) (IEEE, 2020).
Lu, Y.-C., Pentapati, S. & Lim, S. K. The law of attraction: affinity-aware placement optimization using graph neural networks. In Proc. 2021 International Symposium on Physical Design 7–14 (ACM, 2021).
Lu, Y.-C., Nath, S., Khandelwal, V. & Lim, S. K. Doomed run prediction in physical design by exploiting sequential flow and graph learning. In 2021 IEEE/ACM International Conference On Computer Aided Design (ICCAD) 1–9 (IEEE, 2021).
Kirby, R., Godil, S., Roy, R. & Catanzaro, B. CongestionNet: routing congestion prediction using deep graph neural networks. In 27th International Conference on Very Large Scale Integration (VLSI-SoC) 217–222 (IEEE, 2019).
Maji, S., Budak, A. F., Poddar, S. & Pan, D. Z. Toward end-to-end analog design automation with ML and data-driven approaches. In Proc. 29th Asia and South Pacific Design Automation Conference 657–664 (IEEE, 2024).
Zhu, K., Chen, H., Liu, M. & Pan, D. Z. Tutorial and perspectives on MAGICAL: a silicon-proven open-source analog IC layout system. IEEE Trans. Circuits Syst. II: Express Briefs 70, 715–720 (2023).
Kunal, K. et al. ALIGN: open-source analog layout automation from the ground up. In Proc. 56th Annual Design Automation Conference 1–4 (ACM, 2019).
Wang, H. et al. GCN-RL circuit designer: transferable transistor sizing with graph neural networks and reinforcement learning. In 57th ACM/EDAC/IEEE Design Automation Conference 1–6 (IEEE, 2020).
Dong, Z. et al. CktGNN: circuit graph neural network for electronic design automation. In International Conference on Learning Representations (ICLR, 2023).
Zhang, G., He, H. & Katabi, D. Circuit-GNN: graph neural networks for distributed circuit design. In Proc. 36th International Conference on Machine Learning 7364–7373 (ICML, 2019).
Ren, H., Kokai, G. F., Turner, W. J. & Ku, T.-S. ParaGraph: layout parasitics and device parameter prediction using graph neural networks. In 2020 57th ACM/IEEE Design Automation Conference (DAC) 1–6 (IEEE, 2020).
Li, Y. et al. A customized graph neural network model for guiding analog IC placement. In Proc. 39th International Conference on Computer-Aided Design 1–9 (ACM, 2020).
Chen, H. et al. Universal symmetry constraint extraction for analog and mixed-signal circuits with graph neural networks. In 2021 58th ACM/IEEE Design Automation Conference (DAC) 1243–1248 (IEEE, 2021).
Cao, W., Benosman, M., Zhang, X. & Ma, R. Domain knowledge-infused deep learning for automated analog/radio-frequency circuit parameter optimization. In 59th ACM/IEEE Design Automation Conference 1015–1020 (ACM, 2022).
Shi, W. et al. RobustAnalog: fast variation-aware analog circuit design via multi-task RL. In Proc. 2022 ACM/IEEE Workshop on Machine Learning for CAD 35–41 (ACM, 2022).
Luo, Z.-Q. & Zhang, S. Dynamic spectrum management: complexity and duality. IEEE J. Sel. Top. Signal. Process. 2, 57–73 (2008).
Chowdhury, A., Verma, G., Swami, A. & Segarra, S. Deep graph unfolding for beamforming in MU-MIMO interference networks. IEEE Trans. Wirel. Commun. 23, 4889–4903 (2023).
Shi, Q., Razaviyayn, M., Luo, Z.-Q. & He, C. An iteratively weighted MMSE approach to distributed sum-utility maximization for a MIMO interfering broadcast channel. IEEE Trans. Signal. Process. 59, 4331–4340 (2011).
Tassiulas, L. & Ephremides, A. Stability properties of constrained queueing systems and scheduling policies for maximum throughput in multihop radio networks. IEEE Trans. Autom. Control. 37, 1936–1948 (1992).
Joo, C., Sharma, G., Shroff, N. B. & Mazumdar, R. R. On the complexity of scheduling in wireless networks. EURASIP J. Wirel. Commun. Netw. 2010, 418934 (2010).
Dimakis, A. & Walrand, J. Sufficient conditions for stability of longest-queue-first scheduling: second-order properties using fluid limits. Adv. Appl. Probab. 38, 505–521 (2006).
Joo, C. & Shroff, N. B. Local greedy approximation for scheduling in multihop wireless networks. IEEE Trans. Mob. Comput. 11, 414–426 (2012).
Gurobi Optimization. Gurobi optimizer reference manual. Gurobi https://www.gurobi.com/wp-content/plugins/hd_documentations/documentation/9.0/refman.pdf (2020).
Paschalidis, I. C., Huang, F. & Lai, W. A message-passing algorithm for wireless network scheduling. IEEE/ACM Trans. Netw. 23, 1528–1541 (2015).
Zhao, Z., Swami, A. & Segarra, S. Graph-based deterministic policy gradient for repetitive combinatorial optimization problems. In International Conference on Learning Representations (ICLR, 2023).
Zhao, Z., Radojicic, B., Verma, G., Swami, A. & Segarra, S. Delay-aware backpressure routing using graph neural networks. In 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 4720–4724 (IEEE, 2023).
Rusek, K., Suárez-Varela, J., Almasan, P., Barlet-Ros, P. & Cabellos-Aparicio, A. RouteNet: leveraging graph neural networks for network modeling and optimization in SDN. IEEE J. Sel. Areas Commun. 38, 2260–2270 (2020).
Li, B. et al. Learnable digital twin for efficient wireless network evaluation. In 2023 IEEE Military Communications Conference (MILCOM) 661–666 (IEEE, 2023).
Deiana, A. M. et al. Applications and techniques for fast machine learning in science. Front. Big Data 5, 787421 (2022). This extensive review discusses the integration of powerful machine learning methods into a real-time experimental data processing loop to accelerate scientific discovery.
Sirunyan, A. M. et al. Particle-flow reconstruction and global event description with the CMS detector. J. Instrum. 12, P10003 (2017).
Pata, J., Duarte, J., Vlimant, J.-R., Pierini, M. & Spiropulu, M. MLPF: efficient machine-learned particle-flow reconstruction using graph neural networks. Eur. Phys. J. C 81, 381 (2021).
Kieseler, J. Object condensation: one-stage grid-free multi-object reconstruction in physics detectors, graph and image data. Eur. Phys. J. C 80, 886 (2020).
Di Bello, F. A. et al. Reconstructing particles in jets using set transformer and hypergraph prediction networks. Eur. Phys. J. C 83, 596 (2023).
Pata, J. et al. Scalable neural network models and terascale datasets for particle-flow reconstruction. Preprint at https://doi.org/10.21203/rs.3.rs-3466159/v1 (2023).
Sirunyan, A. M. et al. Pileup mitigation at CMS in 13 TeV data. J. Instrum. 15, P09018 (2020).
Strandlie, A. & Frühwirth, R. Track and vertex reconstruction: from classical to adaptive methods. Rev. Mod. Phys. 82, 1419 (2010).
Chatrchyan, S. et al. Description and performance of track and primary-vertex reconstruction with the CMS tracker. J. Instrum. 9, P10009 (2014).
Elabd, A. et al. Graph neural networks for charged particle tracking on FPGAs. Front. Big Data 5, 828666 (2022).
Huang, S.-Y. et al. Low latency edge classification GNN for particle trajectory tracking on FPGAs. In 2023 33rd International Conference on Field-Programmable Logic and Applications (FPL) 294–298 (IEEE, 2023).
Duarte, J. et al. Fast inference of deep neural networks in FPGAs for particle physics. J. Instrum. 13, P07027 (2018). This fundamental work demonstrates the potential of FPGA-implemented deep learning models for achieving ultra-high inference efficiency in particle physics.
FastML Team. fastmachinelearning/hls4ml. GitHub https://github.com/fastmachinelearning/hls4ml (2023).
Xuan, T. et al. Trigger detection for the sPHENIX experiment via bipartite graph networks with set transformer. In Machine Learning and Knowledge Discovery in Databases 51–67 (Springer, 2023).
Moreno, E. A. et al. JEDI-net: a jet identification algorithm based on interaction networks. Eur. Phys. J. C 80, 58 (2020).
Mikuni, V., Nachman, B. & Shih, D. Online-compatible unsupervised non-resonant anomaly detection. Phys. Rev. D 105, 055006 (2022).
Que, Z. et al. LL-GNN: low latency graph neural networks on FPGAs for high energy physics. ACM Trans. Embed. Comput. Syst. 1–28 (2024).
Duarte, J. et al. FPGA-accelerated machine learning inference as a service for particle physics computing. Comput. Softw. Big Sci. 3, 13 (2019).
Krupa, J. et al. GPU coprocessors as a service for deep learning inference in high energy physics. Mach. Learn. Sci. Technol. 2, 035005 (2021).
Bogatskiy, A. et al. Lorentz group equivariant neural network for particle physics. In Proc. 37th International Conference on Machine Learning 992–1002 (ICML, 2020).
Gong, S. et al. An efficient Lorentz equivariant graph neural network for jet tagging. J. High Energy Phys. 7, 030 (2022).
Tsan, S. et al. Particle graph autoencoders and differentiable, learned energy mover’s distance. In Advances in Neural Information Processing Systems (NIPS, 2021).
Atkinson, O., Bhardwaj, A., Englert, C., Ngairangbam, V. S. & Spannowsky, M. Anomaly detection with convolutional graph neural networks. J. High Energy Phys. 8, 080 (2021).
Hao, Z., Kansal, R., Duarte, J. & Chernyavskaya, N. Lorentz group equivariant autoencoders. Eur. Phys. J. C 83, 485 (2023).
Govorkova, E. et al. Autoencoders on field-programmable gate arrays for real-time, unsupervised new physics detection at 40 MHz at the Large Hadron Collider. Nat. Mach. Intell. 4, 154–161 (2022).
Gong, W. & Yan, Q. Graph-based deep learning frameworks for molecules and solid-state materials. Comput. Mater. Sci. 195, 110332 (2021).
Bapst, V. et al. Unveiling the predictive power of static structure in glassy systems. Nat. Phys. 16, 448–454 (2020).
Chen, C., Zuo, Y., Ye, W., Li, X. & Ong, S. P. Learning properties of ordered and disordered materials from multi-fidelity data. Nat. Comput. Sci. 1, 46–53 (2021).
Jang, J., Gu, G. H., Noh, J., Kim, J. & Jung, Y. Structure-based synthesizability prediction of crystals using partially supervised learning. J. Am. Chem. Soc. 142, 18836–18843 (2020).
Chen, C., Ye, W., Zuo, Y., Zheng, C. & Ong, S. P. Graph networks as a universal machine learning framework for molecules and crystals. Chem. Mater. 31, 3564–3572 (2019).
Gasteiger, J., Groß, J. & Günnemann, S. Directional message passing for molecular graphs. In International Conference on Learning Representations (ICLR, 2020).
Choudhary, K. & DeCost, B. Atomistic line graph neural network for improved materials property predictions. npj Comput. Mater. 7, 185 (2021).
Chen, C. & Ong, S. P. A universal graph deep learning interatomic potential for the periodic table. Nat. Comput. Sci. 2, 718–728 (2022).
Batzner, S. et al. E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nat. Commun. 13, 2453 (2022).
Schütt, K., Unke, O. & Gastegger, M. Equivariant message passing for the prediction of tensorial properties and molecular spectra. In Proc. 38th International Conference on Machine Learning 9377–9388 (ICML, 2021).
Thölke, P. & De Fabritiis, G. TorchMD-NET: equivariant transformers for neural network based molecular potentials. In International Conference on Learning Representations (ICLR, 2022).
Liao, Y.-L. & Smidt, T. Equiformer: equivariant graph attention transformer for 3D atomistic graphs. In International Conference on Learning Representations (ICLR, 2023).
Unke, O. T. et al. Machine learning force fields. Chem. Rev. 121, 10142–10186 (2021).
Musil, F. et al. Physics-inspired structural representations for molecules and materials. Chem. Rev. 121, 9759–9815 (2021).
Choudhary, K. et al. Unified graph neural network force-field for the periodic table: solid state applications. Digit. Discov. 2, 346–355 (2023).
Zunger, A. Inverse design in search of materials with target functionalities. Nat. Rev. Chem. 2, 0121 (2018).
Gebauer, N., Gastegger, M. & Schütt, K. Symmetry adapted generation of 3D point sets for the targeted discovery of molecules. In Proc. 33rd International Conference on Neural Information Processing Systems 7566–7578 (Curran Associates Inc., 2019).
Xie, T., Fu, X., Ganea, O.-E., Barzilay, R. & Jaakkola, T. Crystal diffusion variational autoencoder for periodic material generation. In International Conference on Learning Representations (ICLR, 2022).
Lyngby, P. & Thygesen, K. S. Data-driven discovery of 2D materials by deep generative models. npj Comput. Mater. 8, 232 (2022).
Wines, D., Xie, T. & Choudhary, K. Inverse design of next-generation superconductors using data-driven deep generative models. J. Phys. Chem. Lett. 14, 6630–6638 (2023).
Chanussot, L. et al. Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catal. 11, 6059–6072 (2021).
Gene Ontology Consortium. The Gene Ontology resource: 20 years and still going strong. Nucleic Acids Res. 47, D330–D338 (2019).
Jumper, J. et al. Highly accurate protein structure prediction with AlphaFold. Nature 596, 583–589 (2021). This article discusses the application of graph learning models to biology, presenting unprecedentedly high accuracy in predicting protein structures.
Kryshtafovych, A., Schwede, T., Topf, M., Fidelis, K. & Moult, J. Critical assessment of methods of protein structure prediction (CASP) — Round XIV. Proteins: Struct. Funct. Genet. 89, 1607–1617 (2021).
Ingraham, J., Garg, V., Barzilay, R. & Jaakkola, T. Generative models for graph-based protein design. In Proc. 33rd International Conference on Neural Information Processing Systems 15820–15831 (Curran Associates Inc., 2019).
Luo, J. & Luo, Y. Contrastive learning of protein representations with graph neural networks for structural and functional annotations. Pac. Symp. Biocomput. 2023, 109–120 (2023).
Gelman, S., Fahlberg, S. A., Heinzelman, P., Romero, P. A. & Gitter, A. Neural networks to learn protein sequence–function relationships from deep mutational scanning data. Proc. Natl Acad. Sci. USA 118, e2104878118 (2021).
Chen, T. et al. HotProtein: a novel framework for protein thermostability prediction and editing. In International Conference on Learning Representations (ICLR, 2022).
Gao, Z. et al. Hierarchical graph learning for protein–protein interaction. Nat. Commun. 14, 1093 (2023).
Lu, W. et al. TANKbind: trigonometry-aware neural networks for drug–protein binding structure prediction. In Proc. 36th International Conference on Neural Information Processing Systems 7236–7249 (Curran Associates Inc., 2022).
Gainza, P. et al. De novo design of protein interactions with learned surface fingerprints. Nature 617, 176–184 (2023).
Ho, J., Jain, A. & Abbeel, P. Denoising diffusion probabilistic models. In Proc. 34th International Conference on Neural Information Processing Systems 6840–6851 (Curran Associates Inc., 2020).
Watson, J. L. et al. De novo design of protein structure and function with RFdiffusion. Nature 620, 1089–1100 (2023).
Stärk, H., Ganea, O., Pattanaik, L., Barzilay, R. & Jaakkola, T. EquiBind: geometric deep learning for drug binding structure prediction. In Proc. 39th International Conference on Machine Learning 20503–20521 (ICML, 2022).
Qian, W. W. et al. Metabolic activity organizes olfactory representations. eLife 12 (2023).
Morselli Gysi, D. et al. Network medicine framework for identifying drug-repurposing opportunities for COVID-19. Proc. Natl Acad. Sci. USA 118, e2025581118 (2021).
Li, S. et al. MONN: a multi-objective neural network for predicting compound–protein interactions and affinities. Cell Syst. 10, 308–322 (2020).
Zitnik, M., Agrawal, M. & Leskovec, J. Modeling polypharmacy side effects with graph convolutional networks. Bioinformatics 34, i457–i466 (2018).
Satorras, V. G., Hoogeboom, E. & Welling, M. E(n) equivariant graph neural networks. In Proc. 38th International Conference on Machine Learning 9323–9332 (ICML, 2021).
Townshend, R. J. et al. ATOM3D: tasks on molecules in three dimensions. In 35th Conference on Neural Information Processing Systems (NIPS, 2021).
Hoogeboom, E., Satorras, V. G., Vignac, C. & Welling, M. Equivariant diffusion for molecule generation in 3D. In Proc. 39th International Conference on Machine Learning 8867–8887 (ICML, 2022).
Guan, J. et al. DecompDiff: diffusion models with decomposed priors for structure-based drug design. In Proc. 40th International Conference on Machine Learning 11827–11846 (ICML, 2023).
Luo, S., Guan, J., Ma, J. & Peng, J. A 3D generative model for structure-based drug design. In Proc. 35th International Conference on Neural Information Processing Systems 6229–6239 (Curran Associates Inc., 2021).
Liu, M., Luo, Y., Uchino, K., Maruhashi, K. & Ji, S. Generating 3D molecules for target protein binding. In Proc. 39th International Conference on Machine Learning 13912–13924 (ICML, 2022).
Peng, X. et al. Pocket2Mol: efficient molecular sampling based on 3D protein pockets. In Proc. 39th International Conference on Machine Learning 17644–17655 (ICML, 2022).
Guan, J. et al. 3D equivariant diffusion for target-aware molecule generation and affinity prediction. In International Conference on Learning Representations (ICLR, 2023).
Wang, J. et al. scGNN is a novel graph neural network framework for single-cell RNA-seq analyses. Nat. Commun. 12, 1882 (2021).
Li, H. et al. Inferring transcription factor regulatory networks from single-cell ATAC-seq data based on graph neural networks. Nat. Mach. Intell. 4, 389–400 (2022).
Cheng, F. et al. Network-based approach to prediction and population-based validation of in silico drug repurposing. Nat. Commun. 9, 2691 (2018).
Cheng, F., Kovács, I. A. & Barabási, A.-L. Network-based prediction of drug combinations. Nat. Commun. 10, 1197 (2019).
Jin, W. et al. Deep learning identifies synergistic drug combinations for treating COVID-19. Proc. Natl Acad. Sci. USA 118, e2105070118 (2021).
Ge, Y. et al. An integrative drug repositioning framework discovered a potential therapeutic agent targeting COVID-19. Signal. Transduct. Target. Ther. 6, 165 (2021).
Zhou, Y. et al. CGC-Net: cell graph convolutional network for grading of colorectal cancer histology images. In 2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW) 388–398 (IEEE, 2019).
Wu, Z. et al. Graph deep learning for the characterization of tumour microenvironments from spatial protein profiles in tissue specimens. Nat. Biomed. Eng. 6, 1435–1448 (2022).
Liu, Z., Li, X., Peng, H., He, L. & Yu, P. S. Heterogeneous similarity graph neural network on electronic health records. In 2020 IEEE International Conference on Big Data 1196–1205 (IEEE, 2020).
Choi, E. et al. Learning the graphical structure of electronic health records with graph convolutional transformer. In Proc. 34th AAAI Conference on Artificial Intelligence 606–613 (AAAI, 2020).
Fey, M. & Lenssen, J. E. Fast graph representation learning with PyTorch Geometric. In ICLR 2019 Workshop on Representation Learning on Graphs and Manifolds (ICLR, 2019).
Wang, M. et al. Deep graph library: a graph-centric, highly-performant package for graph neural networks. Preprint at arXiv https://doi.org/10.48550/arXiv.1909.01315 (2019).
Sarkar, R., Abi-Karam, S., He, Y., Sathidevi, L. & Hao, C. FlowGNN: a dataflow architecture for real-time workload-agnostic graph neural network inference. In 2023 IEEE International Symposium on High-Performance Computer Architecture (HPCA) 1099–1112 (IEEE, 2023).
Huang, G. et al. Machine learning for electronic design automation: a survey. ACM Trans. Des. Autom. Electron. Syst. 26, 1–46 (2021).
He, Z., Wang, Z., Bai, C., Yang, H. & Yu, B. Graph learning-based arithmetic block identification. In 2021 IEEE/ACM International Conference On Computer Aided Design (ICCAD) 1–8 (IEEE, 2021).
He, S. et al. An overview on the application of graph neural networks in wireless networks. IEEE Open. J. Commun. Soc. 2, 2547–2565 (2021).
Zitnik, M., Sosič, R. & Leskovec, J. Prioritizing network communities. Nat. Commun. 9, 2544 (2018).
Hu, W. et al. Strategies for pre-training graph neural networks. In International Conference on Learning Representations (ICLR, 2020).
Tishby, N., Pereira, F. C. & Bialek, W. The information bottleneck method. In Proc. 37th Annual Allerton Conference on Communication, Control and Computing 368–377 (1999).
Miao, S., Liu, M. & Li, P. Interpretable and generalizable graph learning via stochastic attention mechanism. In Proc. 39th International Conference on Machine Learning 15524–15543 (ICML, 2022).
Iiyama, Y. et al. Distance-weighted graph neural networks on FPGAs for real-time particle reconstruction in high energy physics. Front. Big Data 3, 598927 (2021).
Wu, H. & Wang, H. Decoding latency of LDPC codes in 5G NR. In 2019 29th International Telecommunication Networks and Applications Conference (ITNAC) 1–5 (IEEE, 2019).
Wang, Z. et al. GNN-PIM: a processing-in-memory architecture for graph neural networks. In Conference on Advanced Computer Architecture 73–86 (Springer, 2020).
Huang, Y. et al. Accelerating graph convolutional networks using crossbar-based processing-in-memory architectures. In 2022 IEEE International Symposium on High-Performance Computer Architecture (HPCA) 1029–1042 (IEEE, 2022).
Liang, S. et al. EnGN: a high-throughput and energy efficient accelerator for large graph neural networks. IEEE Trans. Comput. 70, 1511–1525 (2020).
Choi, E., Bahadori, M. T., Song, L., Stewart, W. F. & Sun, J. GRAM: graph-based attention model for healthcare representation learning. In Proc. 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 787–795 (ACM, 2017).
Sajadmanesh, S., Shamsabadi, A. S., Bellet, A. & Gatica-Perez, D. GAP: differentially private graph neural networks with aggregation perturbation. In Proc. 32nd USENIX Conference on Security Symposium 3223–3240 (USENIX Association, 2023).
Chien, E. et al. Differentially private decoupled graph convolutions for multigranular topology protection. In Proc. 37th International Conference on Neural Information Processing Systems 45381–45401 (Curran Associates Inc., 2023).
Cao, Y. & Yang, J. Towards making systems forget with machine unlearning. In 2015 IEEE Symposium on Security and Privacy 463–480 (IEEE, 2015).
Chien, E., Wang, H. P., Chen, Z. & Li, P. Langevin unlearning. In Privacy Regulation and Protection in Machine Learning Workshop (ICLR, 2024).
Chien, E., Pan, C. & Milenkovic, O. Efficient model updates for approximate unlearning of graph-structured data. In International Conference on Learning Representations (ICLR, 2023).
Mironov, I. Rényi differential privacy. In 2017 IEEE 30th Computer Security Foundations Symposium (CSF) 263–275 (IEEE, 2017).
