A weight smoothing algorithm is proposed in this paper to improve a neural network's generalization capability. The algorithm can be applied when the data patterns to be classified are presented on an n-dimensional grid (n ≥ 1) and there exist correlations among neighboring data points within a pattern. In a fully interconnected feedforward network, no such correlation information is embedded in the architecture; consequently, the correlations can only be extracted through a sufficient amount of training. With the proposed algorithm, a smoothing constraint is incorporated into the objective function of backpropagation to reflect the neighborhood correlations and to favor solutions with smooth connection weights. Experiments were performed on waveform classification, multifont alphanumeric character recognition, and handwritten numeral recognition. The results indicate that (1) networks trained with the algorithm do have smooth connection weights, and (2) they generalize better.
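
As a rough illustration only (not the paper's exact formulation), the smoothing constraint can be sketched as a quadratic penalty on differences between weights attached to neighboring input points, added to the ordinary backpropagation error. The weight matrix W, the 1-D neighbor structure, and the coefficient lam below are assumptions made for the sketch.

```python
import numpy as np

def smoothness_penalty(W, lam=1e-3):
    # W: hidden-by-input weight matrix; columns are assumed ordered along a
    # 1-D input grid, so adjacent columns correspond to neighboring points.
    diffs = W[:, 1:] - W[:, :-1]      # first differences between weights of neighbors
    return lam * np.sum(diffs ** 2)   # quadratic smoothness penalty

def smoothed_objective(bp_error, W, lam=1e-3):
    # Ordinary backpropagation error plus the smoothing term.
    return bp_error + smoothness_penalty(W, lam)
```

In such a scheme the penalty's gradient with respect to each weight would simply be added to the usual backpropagation gradient during training.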