A Gentle Introduction to Optimization

By B. Guenin, J. Könemann, and L. Tunçel
Published on June 10, 2014

Table of Contents

Preface
1 Introduction
1.1 A first example
1.1.1 The formulation
1.1.2 Correctness
1.2 Linear programs
1.2.1 Multiperiod models
1.3 Integer programs
1.3.1 Assignment problem
1.3.2 Knapsack problem
1.4 Optimization problems on graphs
1.4.1 Shortest path problem
1.4.2 Minimum cost perfect matching
1.5 Integer programs continued
1.5.1 Minimum cost perfect matching
1.5.2 Shortest path problem
1.6 Nonlinear programs
1.6.1 Pricing a tech gadget
1.6.2 Finding a closest point feasible in an LP
1.6.3 Finding a central feasible solution of an LP
1.7 Overview of the book
1.8 Further reading and notes
2 Solving linear programs
2.1 Possible outcomes
2.1.1 Infeasible linear programs
2.1.2 Unbounded linear programs
2.1.3 Linear programs with optimal solutions
2.2 Standard equality form
2.3 A simplex iteration
2.4 Bases and canonical forms
2.4.1 Bases
2.4.2 Canonical forms
2.5 The simplex algorithm
2.5.1 An example with an optimal solution
2.5.2 An unbounded example
2.5.3 Formalizing the procedure
2.6 Finding feasible solutions
2.6.1 General scheme
2.6.2 The two-phase simplex algorithm: an example
2.6.3 Consequences
2.7 Simplex via tableaus*
2.7.1 Pivoting
2.7.2 Tableaus
2.8 Geometry
2.8.1 Feasible region of LPs and polyhedra
2.8.2 Convexity
2.8.3 Extreme points
2.8.4 Geometric interpretation of the simplex algorithm
2.9 Further reading and notes
3 Duality through examples
3.1 The shortest path problem
3.1.1 An intuitive lower bound
3.1.2 A general argument: weak duality
3.1.3 Revisiting the intuitive lower bound
3.1.4 An algorithm
3.1.5 Correctness of the algorithm
3.2 Minimum cost perfect matching in bipartite graphs
3.2.1 An intuitive lower bound
3.2.2 A general argument: weak duality
3.2.3 Revisiting the intuitive lower bound
3.2.4 An algorithm
3.2.5 Correctness of the algorithm
3.2.6 Finding perfect matchings in bipartite graphs*
3.3 Further reading and notes
4 Duality theory
4.1 Weak duality
4.2 Strong duality
4.3 A geometric characterization of optimality
4.3.1 Complementary slackness
4.3.2 Geometry
4.4 Farkas' lemma*
4.5 Further reading and notes
5 Applications of duality*
5.1 Approximation algorithm for set-cover
5.1.1 A primal-dual algorithm
5.1.2 Greed is good ... at least sometimes
5.1.3 Discussion
5.2 Economic interpretation
5.3 The maximum-flow-minimum-cut theorem
5.3.1 Totally unimodular matrices
5.3.2 Applications to st-flows
6 Solving integer programs
6.1 Integer programs versus linear programs
6.2 Cutting planes
6.2.1 Cutting planes and the simplex algorithm
6.3 Branch and bound
6.4 Traveling salesman problem and a separation algorithm*
6.5 Further reading and notes
7 Nonlinear optimization
7.1 Some examples
7.2 Some nonlinear programs are very hard
7.2.1 NLP versus 0,1 integer programming
7.2.2 Hard small-dimensional instances
7.3 Convexity
7.3.1 Convex functions and epigraphs
7.3.2 Level sets and feasible region
7.4 Relaxing convex NLPs
7.4.1 Subgradients
7.4.2 Supporting halfspaces
7.5 Optimality conditions for the differentiable case
7.5.1 Sufficient conditions for optimality
7.5.2 Differentiability and gradients
7.5.3 A Karush-Kuhn-Tucker theorem
7.6 Optimality conditions based on Lagrangians
7.7 Nonconvex optimization problems
7.7.1 The Karush-Kuhn-Tucker theorem for nonconvex problems
7.7.2 Convex relaxation approach to nonconvex problems*
7.8 Interior-point method for linear programs*
7.8.1 A polynomial-time interior-point algorithm*
7.9 Further reading and notes
Appendix A Computational complexity
A.1 What is a fast (resp. slow) algorithm?
A.1.1 The big O notation
A.1.2 Input size and running time
A.1.3 Polynomial and exponential algorithms
A.2 Examples of fast and slow algorithms
A.2.1 Linear programming
A.2.2 Other algorithms in this book
A.3 The classes NP, co-NP and P
A.3.1 Decision problems
A.3.2 The class NP
A.3.3 The class co-NP
A.3.4 The class P
A.4 Hard problems
A.4.1 Reducibility
A.4.2 NP-complete problems
A.4.3 Complexity as guide
References
Index

Summary

A Gentle Introduction to Optimization is a textbook designed to provide a modern and accessible route to the fundamental ideas of optimization for undergraduate students with varying backgrounds and abilities. The book covers topics ranging from linear programs to nonconvex optimization problems, emphasizing the unifying notion of relaxation and the power of primal-dual approaches. The authors, B. Guenin, J. Könemann, and L. Tunçel, draw on 40 years of teaching experience and have thoroughly tested the material. Published in 2014 by Cambridge University Press, the book includes over 140 exercises, with solutions available to instructors, as well as algorithms for computational problems. Its self-contained chapters can be tailored to different needs, and the book is also suitable for self-study.