## Convex Analysis and Nonlinear Optimization: Theory and Examples

Optimization is a rich and thriving mathematical discipline. The theory underlying current computational optimization techniques grows ever more sophisticated. The powerful and elegant language of convex analysis unifies much of this theory. The aim of this book is to provide a concise, accessible account of convex analysis and its applications and extensions, for a broad audience. It can serve as a teaching text, at roughly the level of first-year graduate students. While the main body of the text is self-contained, each section concludes with an often extensive set of optional exercises. The new edition adds material on semismooth optimization, as well as several new proofs that make the book even more self-contained.

### From inside the book


When the directional derivative f′(x; d) is actually linear in d (that is, f′(x; d) = ⟨a, d⟩ for some element a of E) then we say f is (Gâteaux) differentiable at x, with (Gâteaux) derivative ∇f(x) = a. If f is differentiable at every ...
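The defining identity f′(x; d) = ⟨∇f(x), d⟩ can be checked numerically with a one-sided difference quotient. A minimal sketch, using an illustrative quadratic f chosen here (not an example from the book):

```python
# Check that the difference quotient (f(x + t*d) - f(x)) / t approaches
# <grad f(x), d> as t shrinks, for the illustrative function
# f(x1, x2) = x1^2 + 3*x2^2 with gradient (2*x1, 6*x2).

def f(x):
    return x[0] ** 2 + 3 * x[1] ** 2

def grad_f(x):
    return (2 * x[0], 6 * x[1])

def directional_derivative(func, x, d, t=1e-6):
    """One-sided difference quotient (func(x + t*d) - func(x)) / t."""
    xt = tuple(xi + t * di for xi, di in zip(x, d))
    return (func(xt) - func(x)) / t

x = (1.0, 2.0)
d = (0.5, -1.0)
inner = sum(g * di for g, di in zip(grad_f(x), d))   # exact value: -11.0
approx = directional_derivative(f, x, d)
print(inner, approx)   # the two values agree to several decimal places
```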

The case of this result where C is an open set is the canonical introduction to the use of calculus in optimization: local minimizers x̄ must be critical points (that is, ∇f(x̄) = 0). This book is largely devoted to the study of first ...
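The necessary condition ∇f(x̄) = 0 at a local minimizer can be seen in action: gradient descent on a smooth function settles exactly where the gradient vanishes. A minimal sketch with a hypothetical one-variable example:

```python
# Gradient descent on f(x) = (x - 3)^2 (an illustrative choice):
# the iteration settles at the minimizer x = 3, where f'(x) = 0.

def fprime(x):
    return 2 * (x - 3)

x = 0.0
for _ in range(200):
    x -= 0.1 * fprime(x)   # fixed step size

print(x, fprime(x))   # x is numerically 3, and f'(x) is numerically 0
```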

The element y ∈ Y satisfying ∇f(x̄) = A*y in the above result is called a Lagrange multiplier. This kind of construction recurs in many different forms in our development. In the absence of convexity, we need second order information to ...
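The multiplier condition ∇f(x̄) = A*y can be verified by hand on a small equality-constrained problem. A sketch, using an illustrative instance chosen here (not taken from the book):

```python
# Minimize f(x) = x1^2 + x2^2 subject to x1 + x2 = 1, i.e. Ax = b with
# A = [1 1], b = 1.  By symmetry the minimizer is x = (1/2, 1/2), and
# grad f(x) = (2*x1, 2*x2) = (1, 1) = A^T y with multiplier y = 1.

x = (0.5, 0.5)
grad = (2 * x[0], 2 * x[1])
y = 1.0
assert x[0] + x[1] == 1.0            # feasibility: Ax = b
assert grad == (y * 1.0, y * 1.0)    # multiplier condition: grad f(x) = A^T y
print("Lagrange multiplier y =", y)
```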

If the vector d = ∇f(x^ε) satisfies ‖d‖ > ε then, from the inequality −⟨∇f(x^ε), d⟩ = −‖d‖² < −ε‖d‖, we would have for small t > 0 the contradiction −tε‖d‖ > f(x^ε − td) − f(x^ε) = (f(x^ε − td) + ε‖x^ε − td‖) − (f(x^ε) + ε‖x^ε‖) + ε(‖x^ε‖ − ‖x^ε − td‖) ≥ ...

(Coercivity) Suppose that the function f : E → R is differentiable and satisfies the growth condition lim_{‖x‖→∞} f(x)/‖x‖ = +∞. Prove that the gradient map ∇f has range E. (Hint: Minimize the function f(·) − ⟨a, ·⟩ for elements a of E.) ...
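The hint in the coercivity exercise can be carried out concretely in one dimension. A sketch for the illustrative case f(x) = x², where f(x)/|x| = |x| → +∞ and the gradient map f′(x) = 2x is indeed onto:

```python
# For f(x) = x^2 and any target a, minimize g(x) = f(x) - a*x;
# at the minimizer, 0 = g'(x) = f'(x) - a, so f'(x) = a.  Hence every
# a lies in the range of the gradient map, as the exercise asserts.

def solve_fprime_equals(a, steps=500, lr=0.1):
    """Gradient descent on g(x) = x^2 - a*x; converges to the minimizer a/2."""
    x = 0.0
    for _ in range(steps):
        x -= lr * (2 * x - a)   # g'(x) = 2x - a
    return x

for a in (-3.0, 0.0, 7.5):
    x = solve_fprime_equals(a)
    print(a, 2 * x)   # f'(x) = 2x recovers each target a
```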


### Contents

| Chapter | Page |
| --- | --- |
| Inequality Constraints | 15 |
| Fenchel Duality | 33 |
| Convex Analysis | 65 |
| Special Cases | 97 |
| Nonsmooth Optimization | 123 |
| Karush–Kuhn–Tucker Theory | 153 |
| Fixed Points | 179 |
| Infinite Versus Finite Dimensions | 209 |
| List of Results and Notation | 221 |
| Bibliography | 241 |
| Index | 253 |

### Other editions - View all

Convex Analysis and Nonlinear Optimization: Theory and Examples. Jonathan M. Borwein, Adrian S. Lewis. No preview available - 2000