Recovering a planted vector $v$ in an $n$-dimensional random subspace of $\mathbb{R}^N$ is a generic task related to many problems in machine learning and statistics, such as dictionary learning, subspace recovery, principal component analysis, and non-Gaussian component analysis. In this work, we study computationally efficient estimation and detection of a planted vector $v$ whose $\ell_4$ norm differs from that of a Gaussian vector with the same $\ell_2$ norm. For instance, in the special case where $v$ is an $N\rho$-sparse vector with Bernoulli--Gaussian or Bernoulli--Rademacher entries, our results include the following:
(1) We give an improved analysis of (a slight variant of) the spectral method proposed by Hopkins, Schramm, Shi, and Steurer (2016), showing that it approximately recovers $v$ with high probability in the regime $n\rho \ll \sqrt{N}$. This condition subsumes, up to polylogarithmic factors, the conditions $\rho \ll 1/\sqrt{n}$ or $n\sqrt{\rho} \lesssim \sqrt{N}$ required by previous work. We achieve $\ell_\infty$ error bounds for the spectral estimator via a leave-one-out analysis, from which it follows that a simple thresholding procedure exactly recovers $v$ with high probability in the dense case $\rho = 1$.
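As an illustration, an HSSS-style spectral method of the kind analyzed above can be sketched as follows. This is a minimal sketch under simplifying assumptions, not the paper's exact variant: the input is an orthonormal basis of the subspace, the weights are centered leverage scores, and the parameter choices in the usage example are hypothetical.

```python
import numpy as np

def spectral_estimate(B):
    """Sketch of an HSSS-style spectral method (illustrative only).

    B: (N, n) matrix whose columns form an orthonormal basis of the
    subspace; its rows b_i live in R^n.  Returns an estimate of the
    planted vector in R^N, up to sign and scale.
    """
    N, n = B.shape
    # Centered leverage scores: ||b_i||^2 - n/N, large where the
    # planted vector carries unusually much mass.
    w = np.sum(B**2, axis=1) - n / N
    # Weighted second-moment matrix M = sum_i w_i * b_i b_i^T  (n x n).
    M = (B * w[:, None]).T @ B
    # Leading eigenvector (by absolute eigenvalue), mapped back to R^N.
    vals, vecs = np.linalg.eigh(M)
    u = vecs[:, np.argmax(np.abs(vals))]
    return B @ u
```

On synthetic Bernoulli--Gaussian data well inside the regime $n\rho \ll \sqrt{N}$, this estimator correlates strongly with the planted vector, consistent with the recovery guarantee stated above.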
(2) We study the associated detection problem and show that in the regime $n\rho \gg \sqrt{N}$, spectral methods from a large class (and, more generally, low-degree polynomials of the input) fail to detect the planted vector. This matches the condition for recovery and offers evidence that no polynomial-time algorithm can succeed in recovering a Bernoulli--Gaussian vector $v$ when $n\rho \gg \sqrt{N}$.