New memristor training method slashes AI energy use by six orders of magnitude

In a study published in Nature Communications, researchers in China report an error-aware probabilistic update (EaPU) method that aligns memristor hardware's noisy weight updates with neural-network training, cutting energy consumption by nearly six orders of magnitude relative to GPUs while improving accuracy on vision tasks. The work validates EaPU on 180 nm memristor arrays and in large-scale simulations.
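The paper itself specifies the EaPU algorithm; the sketch below is only a rough illustration of what a probabilistic, noise-tolerant weight update for coarse, stochastic memristor writes might look like. The function name `eapu_update`, the `device_step` and `write_noise` parameters, and the pulse-probability rule are all assumptions made for this example, not the authors' published method.

```python
import numpy as np

rng = np.random.default_rng(0)

def eapu_update(weights, gradients, lr=0.01, device_step=0.05, write_noise=0.02):
    """Hypothetical error-aware probabilistic update (illustrative only).

    Assume each memristor can only change conductance in coarse, noisy
    increments of roughly `device_step`. Rather than forcing an exact
    gradient step, apply a +/- device_step pulse with a probability chosen
    so that the *expected* weight change matches the ideal gradient step,
    letting the stochastic update absorb the hardware write noise.
    """
    ideal_step = -lr * gradients                          # the update a GPU would apply
    prob = np.clip(np.abs(ideal_step) / device_step, 0.0, 1.0)
    fire = rng.random(weights.shape) < prob               # probabilistic pulse decision
    pulse = np.sign(ideal_step) * device_step             # coarse device increment
    noise = rng.normal(0.0, write_noise, weights.shape)   # stochastic write variation
    return weights + fire * (pulse + noise)

# Toy usage: one noisy update step on a small weight matrix.
w = rng.normal(0.0, 0.1, (4, 4))
g = rng.normal(0.0, 1.0, (4, 4))
w_new = eapu_update(w, g)
print(np.round(w_new - w, 3))
```

In this toy setup, weights whose ideal step is small relative to the device increment are updated only occasionally, which is one common way to reconcile fine-grained gradient descent with coarse, noisy analog hardware.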

