Novel out-of-core mechanism introduced for large-scale graph neural network training

A research team has introduced Capsule, a new out-of-core mechanism for large-scale GNN training that achieves up to a 12.02× improvement in runtime efficiency while using only 22.24% of the main memory required by state-of-the-art out-of-core GNN systems. The work was published in the Proceedings of the ACM on Management of Data. The team included the Data Darkness Lab (DDL) at the Medical Imaging Intelligence and Robotics Research Center of the University of Science and Technology of China (USTC) Suzhou Institute.
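The announcement does not describe Capsule's internal design, but the general idea behind out-of-core GNN training is to keep the full graph and feature data on disk and stage only each mini-batch's working set into main memory. The sketch below is a minimal illustration of that pattern in Python/PyTorch using a memory-mapped feature matrix; the file name, sizes, and the single linear layer are placeholders for illustration only, not Capsule's actual implementation.

```python
import numpy as np
import torch

# Hypothetical sizes; the full feature matrix lives on disk, not in RAM.
NUM_NODES, FEAT_DIM, BATCH_SIZE = 1_000_000, 128, 1024

# Disk-backed feature matrix via memory mapping: only the rows actually
# touched by a mini-batch are paged into main memory.
features = np.memmap("node_feats.bin", dtype=np.float32,
                     mode="w+", shape=(NUM_NODES, FEAT_DIM))

model = torch.nn.Linear(FEAT_DIM, 64)  # stand-in for a real GNN layer

def train_step(batch_nodes: np.ndarray) -> torch.Tensor:
    """Gather only the sampled nodes' features from disk, then compute."""
    batch_feats = torch.from_numpy(np.array(features[batch_nodes]))
    return model(batch_feats).relu().mean()

# One illustrative step: sample a mini-batch of node IDs and train on it.
batch = np.random.randint(0, NUM_NODES, size=BATCH_SIZE)
loss = train_step(batch)
loss.backward()
print(float(loss))
```

In a real out-of-core system the sampling step would also fetch graph structure (neighbor lists) from disk and the staged subgraph would feed a full GNN model; the point of the sketch is only that per-batch memory stays proportional to the working set rather than to the whole graph.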
