This paper studies the output convergence of a class of recurrent neural networks with time-varying inputs. The dynamic structure of the studied model differs from that of the well-known Hopfield model: it contains no linear terms. Since different structures of differential equations usually lead to quite different dynamic behaviors, the convergence properties of this model differ substantially from those of the Hopfield model. This class of neural networks has found many successful applications in solving optimization problems. Sufficient conditions guaranteeing output convergence of the networks are derived.
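To make the structural difference concrete, a minimal sketch follows. The first equation is the standard continuous Hopfield form, which contains a linear decay term; the second illustrates a network without linear terms and with time-varying inputs. The notation ($w_{ij}$ for weights, $g_j$ for activations, $I_i(t)$ for inputs) is an illustrative assumption and need not match the paper's exact model.

% Hopfield-type dynamics (standard form): note the linear decay term $-x_i(t)$.
\begin{equation*}
  \dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n} w_{ij}\, g_j\bigl(x_j(t)\bigr) + I_i,
  \qquad i = 1, \dots, n.
\end{equation*}

% Illustrative form of the class considered here (assumed for the sketch):
% no linear term, and the inputs $I_i(t)$ may vary with time.
\begin{equation*}
  \dot{x}_i(t) = \sum_{j=1}^{n} w_{ij}\, g_j\bigl(x_j(t)\bigr) + I_i(t),
  \qquad i = 1, \dots, n,
\end{equation*}
% with outputs $y_i(t) = g_i\bigl(x_i(t)\bigr)$; output convergence means each
% $y_i(t)$ approaches a constant limit as $t \to \infty$.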