2021 Virtual INFORMS Optimization Society Conference

Event Info

March 29-31, 2021
11am-12noon EDT
Duration: 1 hour per day

Watch the replays below. 

Plenary Speakers

Daniel Kuhn

Daniel Kuhn holds the Chair of Risk Analytics and Optimization at EPFL. Before joining EPFL, he was a faculty member at Imperial College London (2007–2013) and a postdoctoral researcher at Stanford University (2005–2006). He received a Ph.D. in Economics from the University of St. Gallen in 2004 and an M.Sc. in Theoretical Physics from ETH Zürich in 1999. His research interests revolve around robust optimization and stochastic programming.
Fatma Kılınç-Karzan

Fatma Kılınç-Karzan is the Frank A. and Helen E. Risch Faculty Development Chair and Associate Professor of Operations Research at the Tepper School of Business, Carnegie Mellon University, with a courtesy appointment in the Department of Computer Science. She completed her Ph.D. at the Georgia Institute of Technology in 2011. Her research focuses on foundational theory and algorithms for convex optimization and structured nonconvex optimization, and on their applications in optimization under uncertainty, machine learning, and business analytics. Her work has received several best paper awards, including the 2015 INFORMS Optimization Society Prize for Young Researchers and the 2014 INFORMS JFIG Best Paper Award. Her research has been supported by generous grants from the NSF and ONR, including an NSF CAREER Award.

Sam Burer

Sam Burer is the George Daly Professor in the Department of Business Analytics at the University of Iowa. He received his Ph.D. from the Georgia Institute of Technology, and his research focuses on convex optimization, especially semidefinite and copositive programming. He is the 2020 recipient of the INFORMS Computing Paper Prize, and his work has been supported by grants from the National Science Foundation, including a CAREER Award. He currently serves as an area editor of Operations Research, as an associate editor for the SIAM Journal on Optimization, and as Vice Chair of the SIAM Activity Group on Optimization.

Conference Schedule

Monday, March 29, 11am-12noon EDT
WATCH THE REPLAY HERE

A General Framework for Optimal Data-Driven Optimization

Speaker: Daniel Kuhn, EPFL

Abstract: We propose a statistically optimal approach to constructing data-driven decisions for stochastic optimization problems. Fundamentally, a data-driven decision is simply a function that maps the available training data to a feasible action, and it can always be expressed as the minimizer of a surrogate optimization model constructed from the data. The quality of a data-driven decision is measured by its out-of-sample risk. An additional quality measure is its out-of-sample disappointment, which we define as the probability that the out-of-sample risk exceeds the optimal value of the surrogate optimization model. The crux of data-driven optimization is that the data-generating probability measure is unknown. An ideal data-driven decision should therefore minimize the out-of-sample risk simultaneously with respect to every conceivable probability measure (and thus in particular with respect to the unknown true measure). Unfortunately, such ideal data-driven decisions are generally unavailable. This prompts us to seek data-driven decisions that minimize the out-of-sample risk subject to an upper bound on the out-of-sample disappointment, again simultaneously with respect to every conceivable probability measure. We prove that such Pareto-dominant data-driven decisions exist under conditions that allow for interesting applications: the unknown data-generating probability measure must belong to a parametric ambiguity set, and the corresponding parameters must admit a sufficient statistic that satisfies a large deviation principle. If these conditions hold, we can further prove that the surrogate optimization model generating the optimal data-driven decision must be a distributionally robust optimization problem constructed from the sufficient statistic and the rate function of its large deviation principle. This shows that the optimal method for mapping data to decisions is, in a rigorous statistical sense, to solve a distributionally robust optimization model. Perhaps surprisingly, this result holds irrespective of whether the original stochastic optimization problem is convex, and it holds even when the training data are not i.i.d. As a byproduct, our analysis reveals how the structural properties of the data-generating stochastic process impact the shape of the ambiguity set underlying the optimal distributionally robust optimization model.

This is joint work with Tobias Sutter and Bart Van Parys.
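
To make the abstract's objects concrete, here is a rough formalization; the notation below is ours, distilled from the abstract rather than taken from the talk. Given training data \xi^N = (\xi_1, \dots, \xi_N) drawn from an unknown measure \mathbb{P}, a data-driven decision \hat{x} maps \xi^N to a feasible action, and the surrogate model's optimal value \hat{c}(\xi^N) serves as its predicted cost. The two quality measures are then

    \text{out-of-sample risk:} \quad \mathcal{R}(\hat{x}(\xi^N), \mathbb{P}) = \mathbb{E}_{\mathbb{P}}\big[\ell(\hat{x}(\xi^N), \xi)\big],
    \text{out-of-sample disappointment:} \quad \mathbb{P}^N\big[\mathcal{R}(\hat{x}(\xi^N), \mathbb{P}) > \hat{c}(\xi^N)\big],

and the optimal surrogate identified in the talk takes the distributionally robust form

    \hat{c}(\xi^N) = \min_{x \in \mathcal{X}} \; \sup_{\mathbb{Q} \in \mathcal{A}(\xi^N)} \mathbb{E}_{\mathbb{Q}}\big[\ell(x, \xi)\big],

where the ambiguity set \mathcal{A}(\xi^N) is constructed from the sufficient statistic and the rate function of its large deviation principle.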

Tuesday, March 30, 11am-12noon EDT 
WATCH THE REPLAY HERE

Exactness in SDP Relaxations of Quadratically Constrained Quadratic Programs

Speaker: Fatma Kılınç-Karzan, CMU

Abstract: Quadratically constrained quadratic programs (QCQPs) are a fundamental class of optimization problems. In a QCQP, we are asked to minimize a (possibly nonconvex) quadratic function subject to a number of (possibly nonconvex) quadratic constraints. Such problems arise naturally in many areas of operations research, computer science, and engineering. Although QCQPs are NP-hard to solve in general, they admit a natural family of tractable convex relaxations. In this talk, we will study the standard semidefinite program (SDP) relaxation for general QCQPs and examine when this relaxation is exact. By analyzing the "geometry" of the SDP relaxation, we will give both sufficient conditions and necessary conditions for several notions of exactness, and discuss a number of implications of these results for various applications.

This is joint work with Alex L. Wang and C.J. Argue.
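
As a hands-on companion to the abstract, the sketch below builds the standard Shor SDP relaxation of a small QCQP in Python using cvxpy and numpy; the instance is our own illustration, not one from the talk. It is a trust-region-type problem (an indefinite quadratic over the unit ball), for which the SDP relaxation is known to be exact, and the rank-one check at the end confirms this numerically.

    # Shor SDP relaxation of a toy nonconvex QCQP:
    #   minimize  x1^2 - x2^2 - x2   subject to  x1^2 + x2^2 <= 1.
    # The optimum is -2 at x = (0, 1); exactness here follows from the S-lemma.
    import cvxpy as cp
    import numpy as np

    n = 2
    A0 = np.diag([1.0, -1.0])    # indefinite, so the QCQP is nonconvex
    b0 = np.array([0.0, -0.5])   # linear term: 2 * b0' x = -x2
    A1 = np.eye(n)               # constraint: x' A1 x - 1 <= 0

    # Lift xx' into the bottom-right block of Z = [[1, x'], [x, X]] >= 0.
    Z = cp.Variable((n + 1, n + 1), PSD=True)
    x, X = Z[1:, 0], Z[1:, 1:]
    constraints = [Z[0, 0] == 1, cp.trace(A1 @ X) - 1 <= 0]
    prob = cp.Problem(cp.Minimize(cp.trace(A0 @ X) + 2 * b0 @ x), constraints)
    prob.solve()

    # If an optimal Z has rank one, then X = xx' and the relaxation is exact:
    # the SDP value is attained by the feasible point x of the original QCQP.
    print("SDP value:", prob.value)  # expect about -2.0, up to solver tolerance
    print("rank of lifted matrix:", np.linalg.matrix_rank(Z.value, tol=1e-5))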

Wednesday, March 31, 11am-12noon EDT
WATCH THE REPLAY HERE

Convexification for Non-Convex Mixed-Integer Quadratic Programming

Speaker: Sam Burer, University of Iowa

Abstract: Convexification is an important technique for solving nonconvex mixed-integer quadratic programs. We discuss three recent convexification results for nonconvex quadratic programming over: (i) bounded (x1, x2, x3) with x1*x2 = x3; (ii) continuous (x1, x2) and binary (y1, y2) such that (0,0) <= (x1, x2) <= (y1, y2); and (iii) a ball intersected with a second-order cone. Although these structures may seem quite specialized, they appear as critical substructures in numerous applications. In addition to describing these three results, we survey the landscape of convexification techniques in this area, including the current research frontier.

This talk reflects joint work with Kurt Anstreicher, Anders Eltved, and Kyungchan Park.
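
For context on structure (i): when x3 = x1*x2 is relaxed using only box bounds l <= (x1, x2) <= u, the classical convexification is the McCormick envelope (McCormick 1976; Al-Khayyal and Falk 1983), and the talk's result for bounded (x1, x2, x3) can be read against that baseline. The standard envelope consists of the four inequalities

    x_3 \ge l_2 x_1 + l_1 x_2 - l_1 l_2, \qquad x_3 \ge u_2 x_1 + u_1 x_2 - u_1 u_2,
    x_3 \le u_2 x_1 + l_1 x_2 - l_1 u_2, \qquad x_3 \le l_2 x_1 + u_1 x_2 - l_2 u_1,

each obtained by multiplying out a product of two bound constraints (for instance, (x_1 - l_1)(x_2 - l_2) >= 0 yields the first). Over the box, these four inequalities together with the bounds describe the convex hull of {(x_1, x_2, x_3) : x_3 = x_1 x_2, l <= (x_1, x_2) <= u} exactly.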