In this paper we focus on the convergence analysis of the
proximal forward-backward splitting method for solving nonsmooth
optimization problems in Hilbert spaces, where the objective function
is the sum of two convex functions. Assuming that one of the
functions is Fréchet differentiable and using two new
linesearches, we establish weak convergence without any
Lipschitz continuity assumption on the gradient. Furthermore, we obtain several complexity bounds on the cost values at the
iterates when the stepsizes are bounded below by a positive
constant. A fast version with linesearch is also provided.
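To fix ideas, the following is a minimal sketch of a proximal forward-backward iteration with a standard backtracking linesearch (an Armijo/sufficient-decrease rule in the spirit of Beck and Teboulle), not the paper's two new linesearches. The smooth term f(x) = ½‖Ax − b‖², the ℓ1 regularizer, and all parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, x0, alpha0=1.0, beta=0.5, iters=200):
    """Forward-backward splitting for min_x 0.5*||Ax-b||^2 + lam*||x||_1,
    with a backtracking linesearch on the stepsize alpha (illustrative,
    not the linesearches proposed in the paper)."""
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    x = x0.copy()
    for _ in range(iters):
        alpha = alpha0
        gx = grad(x)
        while True:
            # forward (gradient) step followed by backward (proximal) step
            x_new = soft_threshold(x - alpha * gx, alpha * lam)
            d = x_new - x
            # sufficient-decrease test; shrink alpha until it holds
            if f(x_new) <= f(x) + gx @ d + (0.5 / alpha) * (d @ d):
                break
            alpha *= beta
        x = x_new
    return x

# usage: a small synthetic lasso instance
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
x_hat = forward_backward(A, b, lam=0.1, x0=np.zeros(5))
```

Note that the backtracking loop never evaluates a global Lipschitz constant of the gradient; it only tests a local decrease condition, which is the point of linesearch-based analyses such as the one developed here.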