Journal of the Operations Research Society of China ›› 2025, Vol. 13 ›› Issue (3): 837-872.doi: 10.1007/s40305-025-00609-9


Improved Convergence Rate of Nested Simulation with LSE on Sieve

Ruo-xue Liu1, Liang Ding2, Wen-jia Wang3, Lu Zou4   

  1. Division of Emerging Interdisciplinary Areas, Academy of Interdisciplinary Studies, The Hong Kong University of Science and Technology, Hong Kong 999077, China;
    2 School of Data Science, Fudan University, Shanghai 200437, China;
    3 Data Science and Analytics Thrust, Information Hub, The Hong Kong University of Science and Technology (Guangzhou), Guangzhou 511453, Guangdong, China;
    4 School of Management, Shenzhen Polytechnic University, Shenzhen 518055, Guangdong, China
  • Received: 2023-10-19 Revised: 2025-04-15 Online: 2025-09-30 Published: 2025-09-16
  • Contact: Lu Zou E-mail: luzou0330@szpu.edu.cn
  • Supported by:
    This work is partly supported by the National Natural Science Foundation of China (Nos. 72301076 and 12101149) and the Science and Technology Commission of Shanghai Municipality (No. 23PJ1400800).

Abstract: Nested simulation refers to estimating functionals of conditional expectations via simulation. In this paper, we treat the conditional expectation as a function of the multidimensional conditioning variable and provide asymptotic analyses of general nonparametric least squares estimators on sieves, without imposing specific assumptions on the function's form. Our study explores scenarios in which the convergence rate surpasses that of the standard nested Monte Carlo method and that of a recently proposed estimator based on kernel ridge regression. We use kernel ridge regression with inducing points and neural networks as examples to illustrate our theorems, and numerical experiments support our statements.
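To make the setting concrete, the sketch below contrasts standard nested Monte Carlo with a regression-based estimator on a toy problem. It is a minimal illustration, not the paper's method: the target functional (g(t) = t^2), the data-generating model, and the cubic-polynomial sieve are all hypothetical choices for exposition; the paper studies general sieves, including kernel ridge regression with inducing points and neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nested-simulation problem: estimate rho = E[g(mu(X))] with
# g(t) = t**2, X ~ Uniform(0, 1), and inner samples Y | X ~ N(mu(X), 1),
# where mu(x) = x, so the true value is E[X**2] = 1/3.
def g(t):
    return t ** 2

# --- Standard nested Monte Carlo: m inner samples per outer scenario ---
n_outer, m_inner = 2000, 200
x = rng.uniform(0.0, 1.0, size=n_outer)
y = x[:, None] + rng.standard_normal((n_outer, m_inner))  # inner simulation
mu_hat = y.mean(axis=1)                                   # inner averages
est_nested = g(mu_hat).mean()

# --- Sieve least squares: one inner sample per scenario, then regress ---
# The sieve here is a cubic polynomial basis, chosen only for simplicity.
n = 50000
xs = rng.uniform(0.0, 1.0, size=n)
ys = xs + rng.standard_normal(n)             # single inner sample per scenario
basis = np.vander(xs, N=4, increasing=True)  # columns [1, x, x^2, x^3]
coef, *_ = np.linalg.lstsq(basis, ys, rcond=None)
mu_fit = basis @ coef                        # fitted conditional mean
est_sieve = g(mu_fit).mean()

print(f"true: {1/3:.4f}  nested MC: {est_nested:.4f}  sieve LSE: {est_sieve:.4f}")
```

The regression-based estimator spends the entire simulation budget on distinct outer scenarios (one inner sample each) and recovers the conditional mean by smoothing across scenarios, which is the mechanism behind the improved convergence rates analyzed in the paper.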

Key words: nested simulation, nonparametric estimation
