In this work, we interpret real symmetric eigenvalue problems in an unconstrained global optimization framework. More precisely, given two N×N matrices, a symmetric matrix A and a symmetric positive definite matrix B, we propose and analyze a nonconvex functional F whose local minimizers are in fact global minimizers. These minimizers correspond to eigenvectors of the generalized eigenvalue problem Ax = λBx associated with the smallest eigenvalue. To minimize the proposed functional F, we employ the gradient descent method and prove its global convergence. Furthermore, we derive explicit error estimates for the eigenvalues and eigenvectors at the k-th iteration of the method in terms of the gradient of F at the k-th iterate x_k. Finally, we present a few numerical experiments that confirm our analysis, compare our approach with other methods, and reveal interesting numerical aspects of the proposed model.
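To fix notation, the generalized eigenvalue problem and a generic gradient-descent iteration can be written as below; the step size α_k shown here is a placeholder, and the concrete functional F and its step-size rule are those defined and analyzed in the body of the paper.

% Generalized eigenvalue problem and a generic gradient-descent iteration
% (\alpha_k is a placeholder step size; the specific functional F and its
%  step-size rule are the ones defined in the paper).
\begin{equation*}
  A x = \lambda B x,
  \qquad
  x_{k+1} = x_k - \alpha_k \nabla F(x_k), \quad k = 0, 1, 2, \dots
\end{equation*}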