In this paper, a new algorithm is developed for solving linearly constrained nonsmooth optimization problems with convex objective functions. The algorithm is based on the concept of the codifferential. The convergence of the proposed minimization algorithm is proved, and results of numerical experiments on a set of test problems with nonsmooth convex objective functions are reported.
A new algorithm is developed, based on the concept of the codifferential, for minimizing differences of convex nonsmooth functions. Since the computation of the whole codifferential is not always possible, a fixed number of its elements is used to compute the search directions. The convergence of the proposed algorithm is proved. The efficiency of the algorithm is demonstrated by comparing it with the subgradient, truncated codifferential, and proximal bundle methods on nonsmooth optimization test problems.
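The core idea of using only finitely many elements of a (co)differential to build a search direction can be illustrated with a minimal sketch. The example below is not the paper's algorithm: it assumes a convex piecewise-linear objective (a max of affine functions), collects the subgradients of the near-active pieces as a stand-in for a finite bundle of codifferential elements, and takes the minus min-norm element of their convex hull as the search direction, combined with a simple backtracking line search. All function names and tolerances are hypothetical choices for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def min_norm_direction(G):
    """Return -g*, where g* is the min-norm element of conv{rows of G}.

    Solves min ||G^T w||^2 over the unit simplex (a small QP), mimicking
    how bundle-type methods turn a finite set of (co)differential
    elements into a descent direction.
    """
    m = G.shape[0]
    cons = {"type": "eq", "fun": lambda w: np.sum(w) - 1.0}
    res = minimize(lambda w: float((G.T @ w) @ (G.T @ w)),
                   np.full(m, 1.0 / m),
                   bounds=[(0.0, 1.0)] * m, constraints=cons)
    return -(G.T @ res.x)

# Hypothetical test objective: f(x) = max_i a_i^T x = max(|x1|, |x2|),
# minimized at the origin.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])

def f(x):
    return np.max(A @ x)

def active_grads(x, eps):
    # Subgradients of pieces within eps of the max: a finite "bundle".
    vals = A @ x
    return A[vals >= vals.max() - eps]

x = np.array([1.0, 1.0])
for _ in range(50):
    d = min_norm_direction(active_grads(x, eps=0.1))
    if np.linalg.norm(d) < 1e-6:
        break  # 0 is in the convex hull of the bundle: near-stationary
    t = 1.0  # backtracking (Armijo-type) line search
    while f(x + t * d) > f(x) - 0.5 * t * (d @ d) and t > 1e-10:
        t *= 0.5
    x = x + t * d
```

On this toy problem the iterates reach the minimizer at the origin; in general, restricting to a fixed number of bundle elements trades direction quality for the cost of the inner quadratic subproblem.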