This white paper presents a Factor Augmented Sparse Throughput (FAST) model that utilizes both latent factors and sparse idiosyncratic components for nonparametric regression. The FAST model bridges a factor model on one end and a sparse nonparametric model on the other. It encompasses structured nonparametric models such as factor augmented additive models and sparse low-dimensional nonparametric interaction models, and covers cases where the covariates do not admit a factor structure. Using diversified projections as estimates of the latent factor space, we employ truncated deep ReLU networks for nonparametric factor regression without regularization and for the more general FAST model with nonconvex regularization, yielding the factor augmented regression using neural networks (FAR-NN) and FAST-NN estimators, respectively. We show that the FAR-NN and FAST-NN estimators adapt to unknown low-dimensional structures using hierarchical composition models with non-asymptotic minimax rates. We also study statistical learning of factor augmented sparse additive models using more specific neural network architectures. Our results are applicable to weakly dependent cases without factor structure. In proving the main technical results for FAST-NN, we establish new deep ReLU network approximation results that contribute to the foundation of neural network theory. Our theory and methods are further supported by simulation studies and applications to macroeconomic data.
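To make the factor augmentation step concrete, here is a minimal sketch, assuming a simple simulated factor model: the covariates are generated from a few latent factors, a fixed data-independent projection matrix estimates the factor space (the "diversified projection" idea), and the estimated factors are concatenated with the covariates as input to the downstream network. All names, dimensions, and the Rademacher choice of projection matrix are illustrative assumptions, not the paper's actual code.

```python
# Illustrative sketch (not the authors' implementation) of the
# diversified-projection step behind FAR-NN / FAST-NN.
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 500, 200, 3              # samples, covariates, true latent factors

# Factor model for the covariates: X = F B' + noise
F = rng.normal(size=(n, r))        # latent factors (unobserved in practice)
B = rng.normal(size=(p, r))        # factor loadings
X = F @ B.T + 0.5 * rng.normal(size=(n, p))

# Diversified projection: a fixed matrix W with r_bar >= r columns,
# chosen independently of the data (here, Rademacher entries);
# F_hat = X W / p approximately spans the latent factor space.
r_bar = 5
W = rng.choice([-1.0, 1.0], size=(p, r_bar))
F_hat = X @ W / p

# Augmented design [F_hat, X]: the input a FAST-type network would see,
# combining estimated factors with the (sparsely used) raw covariates.
Z = np.hstack([F_hat, X])
print(Z.shape)                     # (500, 205)
```

In this toy setup, regressing a true factor on the columns of `F_hat` recovers it well even though `W` was chosen without looking at the data, which is what lets the subsequent network stage treat the estimated factors as given inputs.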


