In this paper, a new Gradient Descent based Genetic Algorithm (GDGA) is proposed and used to solve benchmark Nonlinear Least Squares (NLS) problems. GDGA uses an analytically computed gradient to perform gradient descent around the best solution found by the Genetic Algorithm (GA). The approach employs the GA to escape local minima and estimate a solution in the neighborhood of the global minimum; once such an approximation is found, gradient descent is started from the GA's solution. Stochastic search algorithms such as the GA can readily locate the vicinity of the global minimum but, owing to the random nature of the search, take a long time to converge to the exact minimum. GDGA therefore synergistically combines the advantages of deterministic local search and heuristic random global search to compute an accurate solution efficiently. Simulation results indicate that GDGA performs well on benchmark NLS problems. © 2006-2016 Asian Research Publishing Network (ARPN).
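The two-stage strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the NLS benchmark (fitting y = a·exp(b·x) to synthetic data), the GA operators (truncation selection, uniform crossover, Gaussian mutation), and the Armijo backtracking step-size rule are all assumptions chosen for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical NLS benchmark: fit the model y = a*exp(b*x) to synthetic data
# generated with a = 2.5, b = 1.3 (illustrative, not the paper's benchmark).
x = np.linspace(0.0, 1.0, 20)
y = 2.5 * np.exp(1.3 * x)

def sse(p):
    """Sum of squared residuals for parameters p = (a, b)."""
    a, b = p
    r = a * np.exp(b * x) - y
    return float(np.dot(r, r))

def grad(p):
    """Analytic gradient of sse: d/da = 2*sum(r*e), d/db = 2*sum(r*a*x*e)."""
    a, b = p
    e = np.exp(b * x)
    r = a * e - y
    return np.array([2.0 * np.sum(r * e), 2.0 * np.sum(r * a * x * e)])

def ga(pop_size=40, gens=100, bounds=(-5.0, 5.0)):
    """Simple real-coded GA: truncation selection, uniform crossover,
    Gaussian mutation, and one elite carried over each generation."""
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, 2))
    for _ in range(gens):
        fit = np.array([sse(p) for p in pop])
        elite = pop[np.argsort(fit)[: pop_size // 2]]
        # pick two parents per offspring, mix genes at random (uniform crossover)
        parents = elite[rng.integers(0, len(elite), (pop_size, 2))]
        mask = rng.random((pop_size, 2)) < 0.5
        pop = np.where(mask, parents[:, 0], parents[:, 1])
        pop += rng.normal(0.0, 0.1, pop.shape)  # Gaussian mutation
        pop[0] = elite[0]                       # elitism: keep the best found
    fit = np.array([sse(p) for p in pop])
    return pop[np.argmin(fit)]

def gradient_descent(p, steps=500):
    """Gradient descent with Armijo backtracking line search."""
    for _ in range(steps):
        g = grad(p)
        f, t = sse(p), 1.0
        # halve the step until a sufficient decrease is achieved
        while sse(p - t * g) > f - 1e-4 * t * np.dot(g, g) and t > 1e-12:
            t *= 0.5
        p = p - t * g
    return p

p_ga = ga()                      # stage 1: heuristic global search
p_star = gradient_descent(p_ga)  # stage 2: deterministic local refinement
```

The backtracking rule keeps each descent step from overshooting, so the refinement stage improves on (or at worst preserves) the GA's best candidate while converging far faster near the minimum than the stochastic search alone.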