Academic Activities

Differentiable Programming Tensor Networks

Updated: June 7, 2021

Talk Title: Differentiable Programming Tensor Networks

Speaker: Liao Haijun, Associate Researcher (Institute of Physics, Chinese Academy of Sciences)

Host: Yang Liping

Time: 10:00 a.m., Tuesday, June 8, 2021

Venue: Room LE201, School of Physics


Abstract: Differentiable programming is an emerging programming paradigm that composes parameterized algorithmic components and optimizes them using gradient-based search. The concept emerged from deep learning but is not limited to training neural networks. We present the theory and practice of programming tensor network algorithms in a fully differentiable way. By formulating a tensor network algorithm as a computation graph, one can compute higher-order derivatives of the program accurately and efficiently using automatic differentiation. We present the essential techniques for differentiating through tensor network contraction algorithms, including numerically stable differentiation of tensor decompositions and efficient backpropagation through fixed-point iterations. Differentiable programming removes the laborious human effort of deriving and implementing analytical gradients for tensor network programs, which opens the door to further innovations in tensor network algorithms and applications.
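One of the two techniques highlighted above, backpropagation through fixed-point iterations, can be illustrated on a toy problem. The sketch below (not code from the paper; `solve` and `grad_solve` are illustrative names) finds the fixed point of the scalar map x = cos(x) + θ by plain iteration, then differentiates the converged solution with respect to θ via the implicit function theorem, so the gradient cost does not depend on the number of forward iterations:

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=10000):
    """Iterate x <- f(x) until successive iterates agree to within tol."""
    x = x0
    for _ in range(max_iter):
        x_new = f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def solve(theta):
    """Fixed point x* of x = cos(x) + theta (a contraction near the solution)."""
    return fixed_point(lambda x: math.cos(x) + theta, 0.0)

def grad_solve(theta):
    """dx*/dtheta from the implicit function theorem.

    Differentiating x* = cos(x*) + theta gives
        dx*/dtheta = 1 / (1 + sin(x*)),
    evaluated only at the converged fixed point -- no need to
    backpropagate through every forward iteration.
    """
    x_star = solve(theta)
    return 1.0 / (1.0 + math.sin(x_star))

# Sanity check against a central finite difference.
theta, eps = 0.3, 1e-6
fd = (solve(theta + eps) - solve(theta - eps)) / (2 * eps)
print(grad_solve(theta), fd)
```

In tensor network applications the same idea applies to the fixed-point environment tensors of contraction schemes such as CTMRG, where naively unrolling the iteration in an autodiff framework would be far more expensive.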

Ref:  Phys. Rev. X 9, 031041 (2019)


Brief CV: Liao Haijun is an associate researcher and doctoral supervisor at the Institute of Physics, Chinese Academy of Sciences. From 2005 to 2014 he studied at Renmin University of China, where he received his bachelor's and doctoral degrees. From 2014 to 2017 he carried out postdoctoral research in the group of Academician Tao Xiang at the Institute of Physics, CAS, and in 2021 he was selected as a member of the Youth Innovation Promotion Association of CAS. His main research interests are developing new tensor renormalization group methods and applying them to strongly correlated systems such as frustrated magnets and high-temperature superconductors, as well as to classical statistical models. He has published numerous papers in leading international journals such as PRL and PRX.