Preprint A777/2016
Variance-based stochastic extragradient methods with line search for stochastic variational inequalities
Alfredo Noel Iusem | Alejandro Jofré | Roberto Imbuzeiro de Oliveira | Philip Thompson
Keywords: stochastic variational inequalities | extragradient method | line search

We propose stochastic extragradient methods with a line search for stochastic variational inequalities, requiring only pseudo-monotonicity of the operator and no knowledge of its Lipschitz constant L. We provide a convergence and complexity analysis that allows for an unbounded feasible set, an unbounded operator and non-uniform variance of the oracle, without requiring any regularization. We also prove that the generated sequence is bounded in L^p. Alongside the stochastic approximation procedure, we iteratively reduce the variance of the stochastic error. Our methods take stepsizes bounded away from zero and attain a near-optimal oracle complexity and an accelerated rate O(1/K) in terms of the mean (quadratic) natural residual and the mean D-gap function, where K is the number of iterations required for a given tolerance ε > 0. Explicit estimates of the convergence rate, the oracle complexity and the p-moments are given, depending on problem parameters and the distance between the initial iterate and the solution set.
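As a rough illustration of the kind of scheme the abstract describes, the sketch below implements a deterministic extragradient step with an Armijo-type backtracking line search, which avoids any knowledge of the Lipschitz constant L. This is not the authors' exact stochastic method (there is no oracle or variance reduction here); the operator F, the box constraints, and all parameter values are illustrative assumptions.

```python
# Minimal sketch, NOT the paper's method: extragradient with a
# backtracking line search on a monotone affine VI over a box.

def proj_box(x, lo=0.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n."""
    return [min(max(xi, lo), hi) for xi in x]

def F(x):
    """Illustrative monotone affine operator F(x) = A x + b,
    with A = [[0.1, 1], [-1, 0.1]] and b = [-0.5, 0.2]."""
    a, b_ = x
    return [0.1 * a + b_ - 0.5, -a + 0.1 * b_ + 0.2]

def norm(v):
    return sum(vi * vi for vi in v) ** 0.5

def extragradient_linesearch(x, iters=200, alpha0=1.0, theta=0.5, nu=0.9):
    for _ in range(iters):
        Fx = F(x)
        alpha = alpha0
        # Backtrack until alpha * ||F(x) - F(z)|| <= nu * ||x - z||,
        # where z is the trial projected-gradient point.
        while True:
            z = proj_box([xi - alpha * g for xi, g in zip(x, Fx)])
            Fz = F(z)
            diff = [a - b for a, b in zip(Fx, Fz)]
            res = [a - b for a, b in zip(x, z)]
            if alpha * norm(diff) <= nu * norm(res) or norm(res) < 1e-12:
                break
            alpha *= theta
        # Extragradient update: re-project using F evaluated at z.
        x = proj_box([xi - alpha * g for xi, g in zip(x, Fz)])
    return x

x_star = extragradient_linesearch([1.0, 0.0])
```

At an approximate solution the natural residual ||x - proj(x - F(x))|| is small; in the paper's stochastic setting F is replaced by an oracle average whose variance is reduced across iterations.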
