Two-timescale recurrent neural networks for distributed minimax optimization

Neural Netw. 2023 Aug:165:527-539. doi: 10.1016/j.neunet.2023.06.003. Epub 2023 Jun 9.

Abstract

In this paper, we present two-timescale neurodynamic optimization approaches to distributed minimax optimization. We propose four multilayer recurrent neural networks for solving four types of generally nonlinear convex-concave minimax problems subject to linear equality and nonlinear inequality constraints. We derive sufficient conditions that guarantee the stability and optimality of the neural networks, and we demonstrate their viability and efficiency in two applications: Nash-equilibrium seeking in a zero-sum game and distributed constrained nonlinear optimization.
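The abstract does not give the concrete network dynamics, but the core two-timescale idea behind neurodynamic minimax solvers can be illustrated on a toy problem. The sketch below is an assumption-laden stand-in, not the paper's method: it applies Euler-discretized two-timescale gradient descent-ascent to an unconstrained convex-concave function f(x, y) = 0.5x² − 0.5y² + 2xy (whose unique saddle point is (0, 0)), with the maximizing variable evolving on the fast timescale and the minimizing variable on the slow one. The function, step sizes, and iteration count are all illustrative choices.

```python
# Toy convex-concave saddle-point problem (illustrative only, not from
# the paper): f(x, y) = 0.5*x**2 - 0.5*y**2 + 2*x*y, saddle at (0, 0).

def grad_x(x, y):
    return x + 2.0 * y       # df/dx (x is the minimizing variable)

def grad_y(x, y):
    return 2.0 * x - y       # df/dy (y is the maximizing variable)

def two_timescale_gda(x0, y0, lr_slow=1e-3, lr_fast=1e-1, steps=5000):
    """Euler discretization of two-timescale descent-ascent dynamics:
    y ascends f on the fast timescale, x descends on the slow one,
    so y approximately tracks the inner maximizer as x drifts."""
    x, y = x0, y0
    for _ in range(steps):
        y = y + lr_fast * grad_y(x, y)   # fast gradient ascent in y
        x = x - lr_slow * grad_x(x, y)   # slow gradient descent in x
    return x, y

x_star, y_star = two_timescale_gda(1.0, 1.0)
print(x_star, y_star)  # both coordinates approach the saddle point at 0
```

For this strongly-convex-strongly-concave example the timescale separation (lr_fast ≫ lr_slow) keeps the discretized dynamics stable; the paper's networks additionally handle linear equality and nonlinear inequality constraints, which this sketch omits.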

Keywords: Distributed optimization; Minimax optimization; Neurodynamic optimization; Recurrent neural networks.

MeSH terms

  • Algorithms*
  • Computer Simulation
  • Neural Networks, Computer*