In this paper we present a variant of the proximal
forward-backward splitting method for solving nonsmooth optimization
problems in Hilbert spaces, when the objective function is the sum
of two nondifferentiable convex functions. The proposed iteration,
which we call the Proximal Subgradient Splitting Method,
extends the classical projected subgradient iteration for important
classes of problems, exploiting the additive structure of the
objective function. Weak convergence of the generated sequence
is established for several stepsize rules and under suitable
assumptions. Moreover, we analyze the iteration complexity of the method.
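A minimal numerical sketch of an iteration of this type, assuming the method couples a subgradient step on one summand with a proximal step on the other, in the spirit of forward-backward splitting. The problem instance (an $\ell_1$ data-fitting term plus an $\ell_1$ regularizer), the diminishing stepsize rule $\alpha_k = c/\sqrt{k+1}$, and all function names below are illustrative choices, not taken from the paper:

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_subgradient_splitting(A, b, lam, x0, steps=2000, c=0.5):
    """Illustrative sketch: minimize f(x) + g(x) with
    f(x) = ||Ax - b||_1   (handled by a subgradient step) and
    g(x) = lam * ||x||_1  (handled by its proximal operator),
    using diminishing stepsizes alpha_k = c / sqrt(k + 1)."""
    x = x0.copy()
    for k in range(steps):
        u = A.T @ np.sign(A @ x - b)        # u is a subgradient of f at x
        alpha = c / np.sqrt(k + 1)          # diminishing stepsize
        x = prox_l1(x - alpha * u, alpha * lam)  # forward (subgradient) then backward (prox) step
    return x
```

For instance, with `A` the identity the iterates approach the coordinatewise minimizer of `||x - b||_1 + lam * ||x||_1`; as is typical of subgradient schemes, the iterates oscillate near the solution with amplitude on the order of the current stepsize rather than converging monotonically.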