
An exact reformulation algorithm for large nonconvex NLPs involving bilinear terms

Research output: Contribution to journal › Article › peer-review

Abstract

Many nonconvex nonlinear programming (NLP) problems of practical interest involve bilinear terms and linear constraints, as well as, potentially, other convex and nonconvex terms and constraints. In such cases, it may be possible to augment the formulation with additional linear constraints (a subset of Reformulation-Linearization Technique constraints) which do not affect the feasible region of the original NLP but tighten that of its convex relaxation, to the extent that some bilinear terms may be dropped from the problem formulation. We present an efficient graph-theoretical algorithm for effecting such exact reformulations of large, sparse NLPs. The global solution of the reformulated problem using spatial Branch-and-Bound algorithms is usually significantly faster than that of the original NLP. We illustrate this point by applying our algorithm to a set of pooling and blending global optimization problems.
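The core idea described above can be sketched in a small, hypothetical instance (illustrative only; the specific constraint and variables are not taken from the paper): multiplying a linear constraint of the NLP by a variable yields a valid linear relation among bilinear terms, which can then be used to eliminate one of them.

```latex
% Sketch of the reformulation idea (illustrative, not from the paper):
% suppose the NLP contains the linear constraint
%   x_1 + x_2 = 1,
% and the bilinear terms w_{13} = x_1 x_3 and w_{23} = x_2 x_3.
% Multiplying the linear constraint by x_3 gives an RLT constraint:
\[
  (x_1 + x_2)\,x_3 = x_3
  \quad\Longrightarrow\quad
  w_{13} + w_{23} = x_3 .
\]
% This linear relation holds at every feasible point, so adding it does
% not change the feasible region; but it allows w_{23} to be replaced by
% x_3 - w_{13}, so one bilinear term (and the nonconvex relaxation it
% would otherwise require) can be dropped from the formulation.
```

Under this reading, the graph-theoretical algorithm's role is to select, for a large sparse problem, which such products of constraints and variables to generate so that as many bilinear terms as possible become redundant.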

Original language: English
Pages (from-to): 161-189
Number of pages: 29
Journal: Journal of Global Optimization
Volume: 36
Issue number: 2
Publication status: Published - 1 Jan 2006

Keywords

  • Bilinear
  • Convex relaxation
  • Global optimization
  • NLP
  • RRLT constraints
  • Reformulation-linearization technique

