We propose a projection-free conditional gradient-type algorithm for smooth stochastic multi-level composition optimization, where the objective function is a nested composition of $T$ functions and the constraint set is a closed convex set. Our algorithm assumes access to noisy evaluations of the functions and their gradients through a stochastic first-order oracle satisfying certain standard unbiasedness and second-moment assumptions. We show that the number of calls to the stochastic first-order oracle and the linear minimization oracle required by the proposed algorithm to obtain an $\epsilon$-stationary solution are of order $\mathcal{O}_T(\epsilon^{-2})$ and $\mathcal{O}_T(\epsilon^{-3})$, respectively, where $\mathcal{O}_T$ hides constants depending on $T$. In particular, the dependence of these complexity bounds on $\epsilon$ and $T$ is separate, in the sense that changing one does not affect the dependence of the bounds on the other. Moreover, our algorithm is parameter-free and, unlike the common practice in the analysis of stochastic conditional gradient-type algorithms, does not require an (increasing) order of mini-batches to converge.
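To make the projection-free idea concrete, here is a minimal illustrative sketch of a generic stochastic conditional gradient (Frank-Wolfe) loop. This is not the paper's algorithm (which handles nested compositions of $T$ functions); it only shows the role of the linear minimization oracle. The $\ell_1$-ball constraint set, the noisy quadratic objective, and the classical $2/(t+2)$ step-size schedule are all assumptions chosen for the example.

```python
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    # Linear minimization oracle over the l1 ball:
    # argmin_{||s||_1 <= radius} <grad, s> is attained at a signed vertex.
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def stochastic_frank_wolfe(stoch_grad, x0, steps=500, radius=1.0):
    # Generic projection-free loop: every iterate stays feasible because it
    # is a convex combination of LMO vertices, so no projection is needed.
    x = x0.copy()
    for t in range(steps):
        g = stoch_grad(x)                 # one stochastic first-order oracle call
        s = lmo_l1_ball(g, radius)        # one linear minimization oracle call
        gamma = 2.0 / (t + 2)             # classical Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s   # convex update toward the vertex
    return x

# Toy usage (hypothetical problem): minimize ||x - b||^2 over the unit l1 ball,
# observing gradients corrupted by small Gaussian noise.
rng = np.random.default_rng(0)
b = np.array([0.6, -0.3, 0.1])
noisy_grad = lambda x: 2.0 * (x - b) + 0.01 * rng.standard_normal(3)
x_star = stochastic_frank_wolfe(noisy_grad, np.zeros(3))
```

Note that each iteration uses a single noisy gradient sample; convergence here relies on the decaying step size rather than on mini-batches of growing size.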
