To reduce the complexity of the traditional stochastic recursive gradient algorithm (SARAH), a new accelerated algorithm, called Epoch-Doubling-SARAH, is proposed by doubling the number of inner-loop iterations at each epoch, and the convergence of the algorithm is studied in the non-strongly convex setting. By constructing a Lyapunov function, it is proved that the Epoch-Doubling-SARAH algorithm attains a linear convergence rate with complexity O(1/ε + n log(1/ε)), which improves on the complexity of the traditional SARAH algorithm. Finally, the Epoch-Doubling-SARAH algorithm is compared with the SARAH algorithm on the MNIST and Mushroom data sets; the experimental results show that Epoch-Doubling-SARAH converges faster and confirm the correctness of the theoretical analysis.
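To make the epoch-doubling idea concrete, the following is a minimal sketch of SARAH whose inner-loop length is doubled after every epoch, run on a toy least-squares problem. The function names, step size, initial inner-loop length, and the test problem are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def epoch_doubling_sarah(grad, n, w0, eta=0.05, m0=4, epochs=6, seed=0):
    """Sketch of SARAH with an inner-loop length that doubles each epoch.

    grad(w, idx) returns the gradient of component function idx at w;
    idx=None means the full (average) gradient.  All defaults here are
    illustrative choices, not values from the paper.
    """
    rng = np.random.default_rng(seed)
    w = w0.copy()
    m = m0
    for _ in range(epochs):
        w_prev = w
        v = grad(w, None)                   # full gradient at epoch start
        w = w - eta * v
        for _ in range(m):                  # inner loop of length m
            i = rng.integers(n)
            # recursive SARAH estimator: v_t = g_i(w_t) - g_i(w_{t-1}) + v_{t-1}
            v = grad(w, i) - grad(w_prev, i) + v
            w_prev, w = w, w - eta * v
        m *= 2                              # epoch-doubling step
    return w

# toy problem: f(w) = (1/2n) * ||A w - b||^2 (an assumed example, not the
# MNIST/Mushroom experiments from the paper)
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad(w, idx):
    if idx is None:
        return A.T @ (A @ w - b) / n
    a = A[idx]
    return a * (a @ w - b[idx])

w = epoch_doubling_sarah(grad, n, np.zeros(d))
loss0 = 0.5 * np.mean((A @ np.zeros(d) - b) ** 2)
loss = 0.5 * np.mean((A @ w - b) ** 2)
```

Doubling the inner-loop length keeps early epochs cheap (frequent full-gradient refreshes) while later epochs amortize the full-gradient cost over many cheap recursive updates, which is the mechanism behind the improved complexity bound.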