Novel out-of-core mechanism introduced for large-scale graph neural network training

A research team has introduced Capsule, a new out-of-core mechanism for large-scale GNN training that achieves up to a 12.02× improvement in runtime efficiency while using only 22.24% of the main memory required by state-of-the-art out-of-core GNN systems. The work, published in the Proceedings of the ACM on Management of Data, comes from the Data Darkness Lab (DDL) at the Medical Imaging Intelligence and Robotics Research Center of the University of Science and Technology of China (USTC) Suzhou Institute.