In this brief, a recurrent neural network (NN) modeled by a differential inclusion and based on the method of penalty functions is proposed for solving the bilevel linear programming problem (BLPP). Compared with existing NNs for the BLPP, the proposed model has the fewest state variables and a simple structure. Using nonsmooth analysis, the theory of differential inclusions, and a Lyapunov-like method, the sequence of equilibrium points of the proposed NN is shown to converge approximately to an optimal solution of the BLPP under certain conditions. Finally, numerical simulations of a supply chain distribution model demonstrate the excellent performance of the proposed recurrent NN.
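For context, the penalty-function idea the abstract relies on can be illustrated on a toy one-dimensional problem (this example and all its numbers are hypothetical and are not the paper's supply chain model): the constrained problem min (x - 2)^2 subject to x <= 1 is replaced by an unconstrained penalized problem whose minimizers approach the constrained optimum x* = 1 as the penalty parameter rho grows.

```python
def penalized_minimizer(rho, lr=None, steps=2000):
    """Minimize the penalized objective (x-2)^2 + rho * max(0, x-1)^2
    by plain gradient descent; a toy sketch of the penalty method."""
    if lr is None:
        # step size below 2/L, where L = 2 + 2*rho bounds the curvature
        lr = 1.0 / (2.0 * (1.0 + rho))
    x = 2.0  # infeasible starting point (violates x <= 1)
    for _ in range(steps):
        # gradient of (x-2)^2 plus gradient of the quadratic penalty term
        grad = 2.0 * (x - 2.0) + 2.0 * rho * max(0.0, x - 1.0)
        x -= lr * grad
    return x

if __name__ == "__main__":
    # minimizers drift toward the constrained optimum x* = 1 as rho grows
    for rho in (1.0, 10.0, 100.0, 1000.0):
        print(rho, penalized_minimizer(rho))
```

Here the penalized minimizer is (2 + rho) / (1 + rho), so the approximation error decays like 1/rho; this mirrors, in the simplest possible setting, how an equilibrium sequence indexed by the penalty parameter can approach the constrained optimum.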
The main objective of this paper is to provide new explicit criteria characterizing weakly lower semicontinuous Lyapunov pairs or functions associated with first-order differential inclusions in Hilbert spaces. These inclusions are governed by a Lipschitzian perturbation of a maximally monotone operator. The dual criteria we give are expressed by means of the proximal and basic subdifferentials of the nominal functions, while the primal conditions are described in terms of the contingent directional derivative. We also propose a unifying review of many other criteria given in the literature. Our approach is based on advanced tools of variational analysis and generalized differentiation.
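For context, a standard formulation from the literature (not stated in the abstract itself) of the objects involved: with A a maximally monotone operator on a Hilbert space H and f : H -> H a Lipschitzian perturbation, the inclusion reads

```latex
\dot{x}(t) \in f(x(t)) - A\,x(t), \qquad x(0) = x_0 \in \overline{\operatorname{dom} A},
```

and a pair of proper lower semicontinuous functions $(V, W)$ is commonly called a Lyapunov pair when, along every solution $x(\cdot)$,

```latex
V(x(t)) + \int_{s}^{t} W(x(\tau))\, d\tau \;\le\; V(x(s)) \qquad \text{for all } t \ge s \ge 0,
```

so that $V$ decreases along trajectories at a rate controlled by $W$; the case $W \equiv 0$ recovers ordinary Lyapunov functions.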