We examine augmented Lagrangians for optimization problems with a single constraint (either an inequality or an equality). We establish links between augmented Lagrangians and Lagrange-type functions and propose a new class of Lagrange-type functions for problems with a single inequality constraint. Finally, we discuss a supergradient algorithm for computing the optimal values of dual problems corresponding to a class of augmented Lagrangians.
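To fix ideas, the dual function of a constrained problem is concave, so its optimal value can be approached by supergradient ascent. The following is a minimal sketch on a convex toy problem using the ordinary Lagrangian rather than an augmented one (the problem, step size, and function names are illustrative choices, not taken from the article); for the toy problem min x² subject to 1 − x ≤ 0, the dual is q(λ) = λ − λ²/4, maximized at λ = 2 with value 1.

```python
# Hedged sketch (not the article's algorithm): projected supergradient
# ascent on the Lagrangian dual of the convex toy problem
#   min x**2  subject to  g(x) = 1 - x <= 0.
# Here q(lam) = min_x [x**2 + lam*(1 - x)] = lam - lam**2/4 is concave,
# and g evaluated at the Lagrangian minimizer is a supergradient of q.

def supergradient_ascent(steps=50, step_size=1.0):
    lam = 0.0
    for _ in range(steps):
        x = lam / 2.0                  # minimizer of the Lagrangian for fixed lam
        supergrad = 1.0 - x            # g(x) is a supergradient of q at lam
        lam = max(0.0, lam + step_size * supergrad)  # project onto lam >= 0
    return lam, lam - lam**2 / 4.0     # dual point and dual value

lam, q = supergradient_ascent()
print(lam, q)  # approaches the dual optimum (2.0, 1.0)
```

For this convex problem there is no duality gap; the point of using augmented Lagrangians, as in the article, is to obtain zero-gap dual problems of the same concave type in nonconvex settings as well.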
In this article, we study the nonlinear penalization of a constrained optimization problem and show that the least exact penalty parameter can be diminished by passing to an equivalent parametric optimization problem. Applying the theory of increasing positively homogeneous (IPH) functions, we derive a simple formula that expresses the least exact penalty parameter of the classical penalty function in terms of the perturbation function. We establish that various equivalent parametric reformulations of a constrained optimization problem lead to a reduction of the exact penalty parameter. To construct a Lipschitz penalty function with a small exact penalty parameter for a Lipschitz programming problem, we transform the objective function by composing it with an increasing concave function. We present results of numerical experiments demonstrating that a Lipschitz penalty function with a small penalty parameter is better suited to solving some nonconvex constrained problems than the classical penalty function.
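The parameter-reduction effect can be seen on a toy problem (my own illustrative example, not one from the article): for min x² subject to 1 − x ≤ 0, the multiplier at the solution x* = 1 is 2, so the classical l1 penalty is exact only for c ≥ 2; composing the objective with the increasing concave map φ(t) = √t yields the equivalent problem min |x| subject to the same constraint, whose multiplier is 1, so a smaller parameter already suffices.

```python
import numpy as np

# Toy problem: min f(x) = x**2  subject to  1 - x <= 0 (i.e., x >= 1).
# Classical l1 penalty: P_c(x) = f(x) + c * max(0, 1 - x), exact iff c >= 2.
# Concave transform phi(t) = sqrt(t) gives the equivalent objective |x|,
# for which the penalty is already exact for c > 1 (illustrative example).

def penalized_min(obj, c, grid):
    """Grid-search minimizer of obj(x) + c * max(0, 1 - x)."""
    vals = obj(grid) + c * np.maximum(0.0, 1.0 - grid)
    return grid[np.argmin(vals)]

grid = np.linspace(-3.0, 3.0, 600001)

x_below = penalized_min(lambda x: x**2, 1.5, grid)          # c below threshold 2
x_above = penalized_min(lambda x: x**2, 2.5, grid)          # c above threshold
x_trans = penalized_min(lambda x: np.sqrt(x**2), 1.5, grid) # transformed objective

print(x_below)  # ~0.75: penalized minimizer violates x >= 1, penalty not exact
print(x_above)  # ~1.0 : recovers the constrained minimizer
print(x_trans)  # ~1.0 : the smaller parameter c = 1.5 already suffices
```

Since φ is increasing, the transformed problem has the same constrained minimizers as the original, while its flatter objective lowers the least exact penalty parameter.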