"볼록 최적화"의 두 판 사이의 차이

Revision as of 22:29, 15 December 2020

Notes

  • CVXOPT is a free software package for convex optimization based on the Python programming language.[1] (A minimal CVXOPT example is sketched after this list.)
  • Convex optimization studies the problem of minimizing a convex function over a convex set.[2] (The standard form of such a problem is written out after this list.)
  • Surprisingly, algorithms for convex optimization have also been used to design algorithms for counting problems over discrete objects such as matroids.[2]
  • Simultaneously, algorithms for convex optimization have become central to many modern machine learning applications.[2]
  • The goal of this book is to enable a reader to gain an in-depth understanding of algorithms for convex optimization.[2]
  • Convex optimization has many applications ranging from operations research and machine learning to quantum information theory.[3]
  • The goal of this course is to investigate in-depth and to develop expert knowledge in the theory and algorithms for convex optimization.[4]
  • The lecture slides are adapted from Dr. Stephen Boyd's lecture notes on Convex Optimization at Stanford University.[5]
  • x + &bgr; y ) = &agr; f i( x ) + &bgr; f i( y )), the problem is said to be one of convex optimization.[6]
  • Note that linear programming is a special case of convex optimization, where the objective and constraint functions are all linear.[6]
  • If a problem can be transformed to an equivalent convex optimization problem, then the ability to visualize its geometry is acquired.[7]
  • Study of equivalence, sameness, and uniqueness therefore pervades the study of convex optimization.[7]
  • A MOOC on convex optimization, CVX101, was run from 1/21/14 to 3/14/14.[8]
  • Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets.[9]
  • Extensions of convex optimization include the optimization of biconvex, pseudo-convex, and quasiconvex functions.[9]
  • This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms.[10]
  • A wealth of existing methodology for convex optimization can then be used to identify points arbitrarily close to the true global optimum.[11]
  • We name our approach Convex Optimization of Autocorrelation with Constrained Support (COACS).[11]
  • In this paper we lay the foundation of robust convex optimization. (Robust Convex Optimization, https://pubsonline.informs.org/doi/10.1287/moor.23.4.769)
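
Written out, the standard form that the notes above refer to is the usual textbook one (the symbols x, f_0, f_i and m below are conventional notation, not taken from any of the cited sources):

  \begin{aligned}
  \text{minimize}\quad   & f_0(x) \\
  \text{subject to}\quad & f_i(x) \le 0, \quad i = 1, \dots, m
  \end{aligned}

Here every f_i is required to be convex, i.e. f_i(αx + βy) ≤ αf_i(x) + βf_i(y) for all x, y and all α, β ≥ 0 with α + β = 1, so the feasible set is convex and a convex objective is minimized over it. When every f_i is affine the problem reduces to a linear program, which is why linear programming appears above as a special case of convex optimization.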

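CVXOPT, mentioned in the first note above, can be tried out on the simplest kind of convex problem, a linear program. The sketch below solves a small two-variable LP made up purely for illustration; it assumes only that the cvxopt package is installed and uses its documented solvers.lp interface.

  # Made-up LP:  minimize   -4x - 5y
  #              subject to  2x +  y <= 3,  x + 2y <= 3,  x >= 0,  y >= 0
  from cvxopt import matrix, solvers

  c = matrix([-4.0, -5.0])                    # objective coefficients
  G = matrix([[2.0, 1.0, -1.0, 0.0],          # column of x coefficients in G x <= h
              [1.0, 2.0, 0.0, -1.0]])         # column of y coefficients
  h = matrix([3.0, 3.0, 0.0, 0.0])            # right-hand sides of the inequalities

  solvers.options['show_progress'] = False    # silence the iteration log
  sol = solvers.lp(c, G, h)                   # interior-point LP solver
  print(sol['status'])                        # 'optimal' when the solver converges
  print(sol['x'])                             # optimizer, approximately (1.0, 1.0)

The same solvers module also provides quadratic and cone program solvers (solvers.qp, solvers.conelp), which is what makes CVXOPT a general convex optimization package rather than only an LP solver.
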
Sources