90. Operations Research, Mathematical Programming

This volume studies decision-making under constraints using mathematical models. It integrates optimization, stochastic models, and algorithmic methods.

Part I. Foundations

Chapter 1. Optimization Models

1.1 Decision variables 1.2 Objective functions 1.3 Constraints 1.4 Feasible regions 1.5 Examples
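The ingredients listed above fit together in one small sketch, using a hypothetical two-variable production model (the coefficients are made up for illustration):

```python
# A tiny illustrative optimization model (hypothetical data): choose
# production quantities x1, x2 to maximize profit under a resource limit.

def objective(x1, x2):
    """Objective function: profit 3*x1 + 5*x2 (assumed coefficients)."""
    return 3 * x1 + 5 * x2

def is_feasible(x1, x2):
    """Constraints: 2*x1 + 4*x2 <= 8 (resource limit), x1, x2 >= 0."""
    return 2 * x1 + 4 * x2 <= 8 and x1 >= 0 and x2 >= 0

# The feasible region is the set of all points satisfying every constraint.
print(is_feasible(1, 1))   # inside the region
print(is_feasible(3, 2))   # violates the resource constraint
print(objective(1, 1))
```

Optimization then means searching the feasible region for the point with the best objective value.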

Chapter 2. Linear Programming

2.1 Standard form 2.2 Feasible solutions 2.3 Geometry of solutions 2.4 Applications 2.5 Examples
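The geometry of linear programs can be made concrete: when a bounded LP has an optimum, one is attained at a vertex of the feasible polyhedron. The sketch below enumerates vertices of a hypothetical two-variable LP by intersecting constraint boundaries pairwise:

```python
from itertools import combinations

# Hypothetical LP: maximize 3*x + 5*y subject to
#   x <= 4,  2*y <= 12,  3*x + 2*y <= 18,  x >= 0,  y >= 0.
# Each constraint is stored as a*x + b*y <= c (sign constraints included).
cons = [(1, 0, 4), (0, 2, 12), (3, 2, 18), (-1, 0, 0), (0, -1, 0)]

def feasible(x, y, tol=1e-9):
    return all(a * x + b * y <= c + tol for a, b, c in cons)

def intersect(c1, c2):
    """Solve the 2x2 system where both constraint boundaries are tight."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel boundaries never meet
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

# Candidate vertices: pairwise boundary intersections that are feasible.
vertices = [p for c1, c2 in combinations(cons, 2)
            if (p := intersect(c1, c2)) and feasible(*p)]
best = max(vertices, key=lambda p: 3 * p[0] + 5 * p[1])
print(best)  # the optimal vertex (2.0, 6.0), objective value 36
```

Vertex enumeration is exponential in general; it is shown here only to illustrate the geometry that the simplex method exploits far more efficiently.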

Chapter 3. Duality

3.1 Dual problems 3.2 Weak and strong duality 3.3 Complementary slackness 3.4 Economic interpretation 3.5 Examples
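Weak duality can be checked numerically: for a primal max c·x s.t. Ax ≤ b, x ≥ 0 and its dual min b·y s.t. Aᵀy ≥ c, y ≥ 0, every feasible pair satisfies c·x ≤ b·y, with equality at optimality. A sketch on a small hypothetical LP:

```python
# Primal: max c.x  s.t.  A x <= b, x >= 0.
# Dual:   min b.y  s.t.  A^T y >= c, y >= 0.
c = [3, 5]
A = [[1, 0], [0, 2], [3, 2]]
b = [4, 12, 18]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def primal_feasible(x):
    return all(dot(row, x) <= bi for row, bi in zip(A, b)) and min(x) >= 0

def dual_feasible(y):
    AT = list(zip(*A))  # columns of A are rows of A^T
    return all(dot(col, y) >= ci for col, ci in zip(AT, c)) and min(y) >= 0

x = [2, 6]          # primal optimal solution
y = [0, 1.5, 1]     # dual optimal solution
assert primal_feasible(x) and dual_feasible(y)
# Weak duality: any primal value <= any dual value; equal at the optimum.
print(dot(c, x), dot(b, y))  # both equal 36
```

Complementary slackness is visible here too: the first primal constraint (x ≤ 4) is slack at x = (2, 6), and correspondingly y₁ = 0.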

Part II. Algorithms for Linear Programming

Chapter 4. Simplex Method

4.1 Basic feasible solutions 4.2 Pivot operations 4.3 Degeneracy 4.4 Applications 4.5 Examples
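A minimal tableau implementation makes the pivot mechanics concrete. This sketch assumes the easy case b ≥ 0, so the slack variables supply an initial basic feasible solution (no Phase I), and it does not guard against cycling under degeneracy:

```python
def simplex(c, A, b):
    """Dantzig-rule tableau simplex for: max c.x s.t. A x <= b, x >= 0,
    assuming b >= 0. Returns (optimal value, x)."""
    m, n = len(A), len(c)
    # Tableau rows: [A | I | b]; bottom row holds reduced costs [-c | 0 | 0].
    T = [A[i] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
         for i in range(m)]
    T.append([-ci for ci in c] + [0.0] * m + [0.0])
    basis = list(range(n, n + m))           # slack variables start basic
    while True:
        # Entering variable: most negative reduced cost (Dantzig rule).
        col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][col] >= -1e-9:
            break                           # no improving direction: optimal
        # Leaving variable: minimum ratio test over positive pivot entries.
        ratios = [(T[i][-1] / T[i][col], i)
                  for i in range(m) if T[i][col] > 1e-9]
        if not ratios:
            raise ValueError("unbounded LP")
        _, row = min(ratios)
        basis[row] = col
        piv = T[row][col]
        T[row] = [v / piv for v in T[row]]  # normalize the pivot row
        for i in range(m + 1):              # eliminate the pivot column
            if i != row and abs(T[i][col]) > 1e-12:
                f = T[i][col]
                T[i] = [v - f * w for v, w in zip(T[i], T[row])]
    x = [0.0] * n
    for i, j in enumerate(basis):
        if j < n:
            x[j] = T[i][-1]
    return T[-1][-1], x

# The same hypothetical LP as in the earlier chapters: max 3x + 5y.
val, x = simplex([3, 5], [[1, 0], [0, 2], [3, 2]], [4, 12, 18])
print(val, x)  # 36.0 [2.0, 6.0]
```

Each iteration moves from one basic feasible solution (vertex) to an adjacent one with no worse objective value; anti-cycling rules such as Bland's rule would be needed for a robust solver.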

Chapter 5. Interior-Point Methods

5.1 Barrier methods 5.2 Path-following algorithms 5.3 Convergence 5.4 Applications 5.5 Examples
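The barrier idea is easiest to see in one dimension. For the toy problem min x subject to x ≥ 1 (optimum x* = 1), the log-barrier subproblem min t·x − log(x − 1) has minimizer 1 + 1/t, which traces the central path toward x* as t grows. A path-following sketch with damped Newton steps (the problem and parameters are made up for illustration):

```python
# Toy problem: minimize x subject to x >= 1 (optimum x* = 1).
# Barrier subproblem: minimize  phi(x) = t*x - log(x - 1)  for growing t.
def barrier_newton(t, x, iters=30):
    """Damped Newton's method on phi; steps are halved to stay strictly
    inside the feasible region x > 1."""
    for _ in range(iters):
        g = t - 1.0 / (x - 1.0)          # phi'(x)
        h = 1.0 / (x - 1.0) ** 2         # phi''(x) > 0 (phi is convex)
        step = g / h
        while x - step <= 1.0:           # backtrack: remain interior
            step /= 2.0
        x -= step
    return x

x, t = 2.0, 1.0
path = []
for _ in range(10):                      # follow the central path
    x = barrier_newton(t, x)             # analytic minimizer is 1 + 1/t
    path.append(x)
    t *= 10                              # increase the barrier weight
print(path[0], path[-1])                 # the path approaches x* = 1
```

Warm-starting each subproblem from the previous minimizer is exactly the mechanism that makes practical interior-point methods efficient.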

Chapter 6. Large-Scale Optimization

6.1 Sparse systems 6.2 Decomposition methods 6.3 Column generation 6.4 Applications 6.5 Examples

Part III. Integer and Combinatorial Optimization

Chapter 7. Integer Programming

7.1 Formulations 7.2 Branch-and-bound 7.3 Cutting planes 7.4 Applications 7.5 Examples
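Branch-and-bound prunes the search tree using a bound from a relaxation. For the 0/1 knapsack problem the LP relaxation has a greedy solution (sort by value/weight, allow one fractional item), which gives a cheap upper bound. A sketch on a small hypothetical instance:

```python
# 0/1 knapsack solved by branch-and-bound (hypothetical small instance).
values = [60, 100, 120]
weights = [10, 20, 30]
capacity = 50

# Items sorted by value/weight ratio, as the greedy LP bound requires.
items = sorted(range(len(values)),
               key=lambda i: values[i] / weights[i], reverse=True)

def bound(k, cap, val):
    """Upper bound from the fractional (LP) relaxation over items k..n."""
    for i in items[k:]:
        if weights[i] <= cap:
            cap -= weights[i]
            val += values[i]
        else:
            return val + values[i] * cap / weights[i]  # fractional item
    return val

best = 0
def branch(k, cap, val):
    global best
    best = max(best, val)               # val is always integer-feasible
    if k == len(items) or bound(k, cap, val) <= best:
        return                          # prune: relaxation cannot beat it
    i = items[k]
    if weights[i] <= cap:               # branch: take item i
        branch(k + 1, cap - weights[i], val + values[i])
    branch(k + 1, cap, val)             # branch: skip item i

branch(0, capacity, 0)
print(best)  # 220 (take the items worth 100 and 120)
```

Cutting planes attack the same gap from the other side, tightening the relaxation itself instead of branching on it.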

Chapter 8. Combinatorial Optimization

8.1 Graph models 8.2 Shortest paths 8.3 Matching 8.4 Applications 8.5 Examples
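Shortest paths are the workhorse combinatorial model. Dijkstra's algorithm, for non-negative edge weights, can be sketched with a binary heap (the graph below is hypothetical):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest path lengths for non-negative edge weights."""
    dist = {source: 0}
    pq = [(0, source)]                   # (distance, node) priority queue
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                     # stale queue entry: skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd             # relax the edge (u, v)
                heapq.heappush(pq, (nd, v))
    return dist

# Hypothetical weighted digraph as an adjacency list.
graph = {
    "a": [("b", 4), ("c", 1)],
    "c": [("b", 2), ("d", 5)],
    "b": [("d", 1)],
}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 3, 'c': 1, 'd': 4}
```

With a binary heap the running time is O((|V| + |E|) log |V|); negative weights require Bellman-Ford instead.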

Chapter 9. Network Flow Problems

9.1 Flow models 9.2 Max-flow min-cut theorem 9.3 Minimum cost flow 9.4 Applications 9.5 Examples
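The max-flow min-cut theorem can be demonstrated by the Edmonds-Karp algorithm: when no augmenting path remains, the nodes still reachable in the residual graph form the source side of a minimum cut. A sketch on a hypothetical network (every node must appear as a key of `cap`):

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly augment along a shortest (BFS) path.
    `cap` is a dict-of-dicts of residual capacities, modified in place."""
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:     # BFS for an augmenting path
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow, parent          # parent's keys = min-cut s-side
        # Find the bottleneck along the path, then push flow through it.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= aug             # use forward residual capacity
            cap[v][u] = cap[v].get(u, 0) + aug  # open the reverse edge
        flow += aug

# Hypothetical network: capacities on directed edges.
cap = {"s": {"a": 3, "b": 2}, "a": {"b": 1, "t": 2},
       "b": {"t": 3}, "t": {}}
f, reachable = max_flow(cap, "s", "t")
print(f)  # 5, matching the cut {s}: edges s->a (3) and s->b (2)
```

Here the flow value 5 saturates both edges leaving the source, so the residual-reachable set is just {s} and its cut capacity 3 + 2 = 5 certifies optimality.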

Part IV. Nonlinear Programming

Chapter 10. Nonlinear Optimization

10.1 Problem formulation 10.2 Optimality conditions 10.3 Local vs global optima 10.4 Applications 10.5 Examples
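The first-order optimality condition for unconstrained smooth problems, grad f = 0, pairs naturally with gradient descent. A sketch on a hypothetical convex quadratic (so the local optimum found is also global):

```python
# Unconstrained minimization by gradient descent on a hypothetical
# smooth objective  f(x, y) = (x - 1)**2 + 10*(y + 2)**2.
def grad(x, y):
    """Gradient of f; first-order optimality requires grad f = 0."""
    return 2 * (x - 1), 20 * (y + 2)

x, y, lr = 0.0, 0.0, 0.05                # fixed step size (assumed)
for _ in range(500):
    gx, gy = grad(x, y)
    x, y = x - lr * gx, y - lr * gy      # step against the gradient

gx, gy = grad(x, y)
print(round(x, 6), round(y, 6))          # near the global minimizer (1, -2)
print(abs(gx) < 1e-6 and abs(gy) < 1e-6) # stationarity holds: True
```

For nonconvex objectives the same iteration only finds stationary points, which is precisely the local-versus-global distinction of this chapter.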

Chapter 11. Convex Optimization

11.1 Convex problems 11.2 Duality 11.3 Algorithms 11.4 Applications 11.5 Examples

Chapter 12. Heuristics and Metaheuristics

12.1 Greedy methods 12.2 Simulated annealing 12.3 Genetic algorithms 12.4 Applications 12.5 Examples
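Simulated annealing accepts some uphill moves early on (with probability exp(−Δ/T)) and becomes greedy as the temperature cools. A seeded sketch on a small random TSP instance, using segment-reversal (2-opt style) moves; all parameters are illustrative choices:

```python
import math
import random

# Simulated annealing on a hypothetical 8-city TSP (seeded for
# reproducibility). Move: reverse a random tour segment (2-opt style).
random.seed(0)
cities = [(random.random(), random.random()) for _ in range(8)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = list(range(8))
best = cur = tour_length(tour)
T = 1.0
while T > 1e-3:
    i, j = sorted(random.sample(range(8), 2))
    cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
    delta = tour_length(cand) - cur
    # Accept improvements always; accept uphill moves with probability
    # exp(-delta / T), which shrinks as the temperature cools.
    if delta < 0 or random.random() < math.exp(-delta / T):
        tour, cur = cand, cur + delta
        best = min(best, cur)
    T *= 0.995                           # geometric cooling schedule
print(best <= tour_length(list(range(8))))  # no worse than the start: True
```

Heuristics like this carry no optimality guarantee; their value is good solutions in reasonable time where exact methods do not scale.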

Part V. Stochastic Models

Chapter 13. Queueing Theory

13.1 Basic models 13.2 Birth-death processes 13.3 Performance measures 13.4 Applications 13.5 Examples
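For the M/M/1 queue (the simplest birth-death model) the standard performance measures have closed forms, tied together by Little's law L = λW. A sketch with hypothetical rates:

```python
# Steady-state measures for the M/M/1 queue: Poisson arrivals at rate lam,
# exponential service at rate mu, requiring utilization rho = lam/mu < 1.
def mm1(lam, mu):
    rho = lam / mu                    # server utilization
    L = rho / (1 - rho)               # mean number in system
    W = L / lam                       # mean time in system (Little's law)
    Lq = rho ** 2 / (1 - rho)         # mean number waiting in queue
    Wq = Lq / lam                     # mean waiting time in queue
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

m = mm1(lam=2.0, mu=3.0)              # hypothetical rates
print(m)  # rho = 2/3, L = 2, W = 1, Lq = 4/3, Wq = 2/3
```

Consistency checks such as L = Lq + ρ and W = Wq + 1/μ are useful sanity tests for any queueing computation.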

Chapter 14. Inventory Models

14.1 Deterministic models 14.2 Stochastic models 14.3 Optimization policies 14.4 Applications 14.5 Examples
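The basic deterministic model is the economic order quantity (EOQ): with annual demand D, fixed order cost K, and holding cost h per unit per year, the optimal lot size is sqrt(2DK/h). A sketch with hypothetical parameters:

```python
import math

# Economic order quantity (EOQ): demand D units/year, fixed order cost K,
# holding cost h per unit per year.
def eoq(D, K, h):
    q = math.sqrt(2 * D * K / h)          # optimal order quantity
    cost = math.sqrt(2 * D * K * h)       # minimum relevant cost per year
    return q, cost

q, cost = eoq(D=1000, K=50, h=4)          # hypothetical parameters
print(q, cost)
# At the optimum, ordering cost K*D/q equals holding cost h*q/2.
```

Stochastic demand adds safety stock and service-level constraints on top of this deterministic trade-off.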

Chapter 15. Markov Decision Processes

15.1 States and actions 15.2 Transition probabilities 15.3 Bellman equations 15.4 Applications 15.5 Examples
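Value iteration applies the Bellman optimality update until the values converge, after which a greedy policy is optimal. A sketch on a tiny hypothetical two-state MDP (states, actions, and rewards are made up for illustration):

```python
# Value iteration on a tiny hypothetical MDP: two states, two actions.
# P[s][a] is a list of (probability, next_state, reward) transitions.
P = {
    0: {"stay": [(1.0, 0, 1.0)],
        "go":   [(0.8, 1, 0.0), (0.2, 0, 1.0)]},
    1: {"stay": [(1.0, 1, 2.0)],
        "go":   [(1.0, 0, 0.0)]},
}
gamma = 0.9                               # discount factor

V = {0: 0.0, 1: 0.0}
for _ in range(200):                      # Bellman updates to a fixed point
    V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in trans)
                for trans in P[s].values())
         for s in P}

# Greedy policy extracted from the converged values.
policy = {s: max(P[s], key=lambda a: sum(p * (r + gamma * V[s2])
                                         for p, s2, r in P[s][a]))
          for s in P}
print(policy, round(V[1], 2))  # state 1 prefers "stay": V[1] = 2/(1-0.9) = 20
```

The update is a contraction with modulus gamma, which is why the fixed number of sweeps above suffices for convergence to high precision.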

Part VI. Game Theory and Decision Analysis

Chapter 16. Game Theory

16.1 Strategic games 16.2 Nash equilibrium 16.3 Cooperative games 16.4 Applications 16.5 Examples
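A pure Nash equilibrium is a profile where no player gains by a unilateral deviation, which can be checked by brute force in a small bimatrix game. A sketch using the prisoner's dilemma payoffs:

```python
from itertools import product

# Prisoner's dilemma payoffs (row player, column player), indexed by the
# action profile; actions: 0 = cooperate, 1 = defect.
payoff = {(0, 0): (3, 3), (0, 1): (0, 5),
          (1, 0): (5, 0), (1, 1): (1, 1)}

def is_nash(a_r, a_c):
    """True if neither player can gain by deviating unilaterally."""
    best_r = all(payoff[(a_r, a_c)][0] >= payoff[(d, a_c)][0] for d in (0, 1))
    best_c = all(payoff[(a_r, a_c)][1] >= payoff[(a_r, d)][1] for d in (0, 1))
    return best_r and best_c

equilibria = [p for p in product((0, 1), repeat=2) if is_nash(*p)]
print(equilibria)  # [(1, 1)]: mutual defection, though (0, 0) pays both more
```

The gap between the equilibrium (1, 1) and the Pareto-superior profile (0, 0) is the classic motivation for cooperative game theory.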

Chapter 17. Decision Theory

17.1 Utility functions 17.2 Risk analysis 17.3 Multi-criteria decision making 17.4 Applications 17.5 Examples

Chapter 18. Robust Optimization

18.1 Uncertainty models 18.2 Worst-case optimization 18.3 Applications 18.4 Examples 18.5 Connections
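With a finite uncertainty set, worst-case optimization reduces to a min-max over scenarios: pick the decision whose worst scenario cost is smallest. A sketch with hypothetical cost data, contrasted with the nominal (average-case) choice:

```python
# Robust selection under uncertainty: minimize the worst-case cost over a
# finite scenario set (hypothetical cost data).
costs = {                 # decision -> cost in each of three scenarios
    "plan_a": [5, 10, 30],
    "plan_b": [18, 19, 20],
    "plan_c": [10, 20, 35],
}

nominal = min(costs, key=lambda d: sum(costs[d]) / len(costs[d]))
robust = min(costs, key=lambda d: max(costs[d]))
print(nominal, robust)  # plan_a is best on average, plan_b in the worst case
```

The divergence between the two answers is the price of robustness: protection against the worst case usually costs some average-case performance.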

Part VII. Applications

Chapter 19. Logistics and Supply Chains

19.1 Transportation problems 19.2 Scheduling 19.3 Routing 19.4 Applications 19.5 Examples
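For the balanced transportation problem, the northwest-corner rule builds an initial basic feasible solution by sweeping the cost table from the top-left cell. A sketch with hypothetical supplies and demands (total supply equals total demand):

```python
# Northwest-corner rule: an initial basic feasible solution for a balanced
# transportation problem (hypothetical supplies and demands).
supply = [20, 30, 25]
demand = [10, 25, 15, 25]

alloc = [[0] * len(demand) for _ in supply]
s, d = supply[:], demand[:]               # working copies
i = j = 0
while i < len(s) and j < len(d):
    x = min(s[i], d[j])                   # ship as much as possible at (i, j)
    alloc[i][j] = x
    s[i] -= x
    d[j] -= x
    if s[i] == 0:
        i += 1                            # row exhausted: move down
    else:
        j += 1                            # column satisfied: move right
for row in alloc:
    print(row)
```

The rule ignores costs entirely; it only supplies a starting basis, which cost-improving procedures (or an LP solver) then refine. Ties like the zero allocation it can produce are exactly the degeneracy discussed in Chapter 4.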

Chapter 20. Finance and Economics

20.1 Portfolio optimization 20.2 Risk management 20.3 Pricing models 20.4 Applications 20.5 Examples
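The two-asset minimum-variance portfolio has a closed form: the weight on asset 1 is (var₂ − cov)/(var₁ + var₂ − 2·cov). A sketch with hypothetical annualized variances and covariance:

```python
# Minimum-variance portfolio of two assets (hypothetical annualized data).
var1, var2, cov = 0.04, 0.09, 0.006     # variances and covariance

w1 = (var2 - cov) / (var1 + var2 - 2 * cov)   # closed-form optimal weight
w2 = 1 - w1                                   # weights sum to one
port_var = w1**2 * var1 + w2**2 * var2 + 2 * w1 * w2 * cov
print(round(w1, 4), round(port_var, 6))
# Diversification: the portfolio variance is below either asset's own.
```

With expected returns and a risk-aversion parameter added, the same quadratic structure becomes the Markowitz mean-variance problem, a small convex QP.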

Chapter 21. Engineering Systems

21.1 Resource allocation 21.2 Control systems 21.3 Network design 21.4 Applications 21.5 Examples

Part VIII. Research Directions

Chapter 22. Advanced Topics

22.1 Large-scale optimization 22.2 Data-driven optimization 22.3 Online algorithms 22.4 Modern developments 22.5 Emerging areas

Chapter 23. Open Problems

23.1 Complexity limits 23.2 Approximation algorithms 23.3 Uncertainty modeling 23.4 Computational challenges 23.5 Future directions

Chapter 24. Historical and Conceptual Notes

24.1 Development of operations research 24.2 Key contributors 24.3 Evolution of mathematical programming 24.4 Cross-disciplinary impact 24.5 Summary

Appendix

A. Optimization method summary B. Algorithm templates C. Proof techniques checklist D. Model formulation examples E. Cross-reference to other MSC branches