Spectral Barron space and deep neural network approximation
Mathematics Seminar
Title: Spectral Barron space and deep neural network approximation
Speaker: 廖钰蕾 (National University of Singapore)
Place: Room 1124, 后主楼
Time: Monday, July 21, 2025, 10:00–11:30
Inviter: 陈华杰
Abstract
This work explores the approximation capabilities of neural networks for functions in the spectral Barron space B^s, where s is the smoothness index. We give a comprehensive study of the spectral Barron space and prove a sharp embedding between the spectral Barron space and the Besov space, with embedding constants independent of the input dimension. Taking the spectral Barron space as the target function space, we show that for functions in B^{1/2}, a shallow neural network (a single hidden layer) with N units achieves an L^p-approximation rate of O(N^{-1/2}). The same rate holds for uniform approximation, up to a logarithmic factor. This significantly relaxes the smoothness requirement of existing theory, which requires functions to belong to B^1 to attain the same rate. Furthermore, we show that increasing the network's depth can notably improve the approximation order for functions with small smoothness: for networks with L hidden layers, functions in B^s with 0 < sL \le 1/2 achieve an approximation rate of O(N^{-sL}). The rates and prefactors in our estimates are dimension-free. We also establish the sharpness of these results: the lower bound matches the upper bound up to at most one logarithmic factor. This is joint work with Prof. Pingbing Ming and Hao Yu.
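For readers scanning the announcement, the two approximation rates stated in the abstract can be written side by side as follows. This is only a schematic restatement of the abstract's claims; the notation \mathcal{F}_{N,L} for the class of networks with L hidden layers and N units per layer is introduced here for illustration and is not taken from the talk itself.

```latex
% Shallow case: one hidden layer (L = 1), target f in the spectral
% Barron space B^{1/2}, approximation measured in L^p:
\inf_{f_N \in \mathcal{F}_{N,1}} \| f - f_N \|_{L^p} = O\!\left(N^{-1/2}\right)

% Deep case: L hidden layers, target f in B^{s} with 0 < sL \le 1/2:
\inf_{f_N \in \mathcal{F}_{N,L}} \| f - f_N \|_{L^p} = O\!\left(N^{-sL}\right)
```

In both estimates the rate and the prefactor are, per the abstract, independent of the input dimension, and the corresponding lower bounds match up to at most one logarithmic factor.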