Cubic Regularization Algorithms for Unconstrained and Constrained Optimization

Author: Ziyan Zhu
Release: 2023



This dissertation focuses on cubic regularization methods for the globalization of Newton's method, with applications in unconstrained and constrained optimization. In recent years, cubic regularization algorithms have emerged as popular alternatives to trust-region and line-search methods for unconstrained optimization. The goal of this research is to tackle some of the challenges associated with cubic regularization and extend the methods to solve constrained problems. The first part of the dissertation is dedicated to enhancing the efficiency of cubic regularization methods for unconstrained optimization without sacrificing their favorable convergence properties. A nonmonotone adaptive cubic regularization approach is proposed that combines adaptive cubic regularization with a line search. In particular, a sufficient decrease in the objective function is obtained by performing a nonmonotone line search that satisfies certain strong Wolfe conditions. This is an alternative to repeatedly solving the cubic subproblem with varying regularization parameters, and it requires less computation and fewer iterations. In addition, two hybrid algorithms are developed that substitute Newton's method for cubic regularization when the objective function is well behaved or a sufficient descent direction is readily obtainable from a conventional Newton step. By judiciously determining when to use cubic regularization, the efficiency of Newton's method can be balanced against the robustness of cubic regularization. In the second part of the dissertation, a novel primal-dual interior-point method for general nonlinear constrained optimization is proposed that minimizes a sequence of shifted primal-dual penalty-barrier functions. The method combines cubic regularization and a line search to ensure global convergence.
The proposed method calculates the cubic regularized step by factoring a sequence of matrices with diagonally-modified primal-dual structure, enabling the use of off-the-shelf linear equation software. This approach allows the extension of the method to large-scale problems while maintaining computational efficiency. Finally, the performance of the proposed algorithms for both unconstrained and constrained optimization is illustrated by extensive numerical results obtained from problems in the CUTEst test collection.
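The cubic regularized step referred to above minimizes a local cubic model of the objective. A minimal sketch of one such subproblem solve is given below; it uses a simple fixed-point iteration and is purely illustrative (it is not the dissertation's factorization-based method, and the name `cubic_reg_step` and the parameter `sigma` are this sketch's own):

```python
import numpy as np

def cubic_reg_step(g, H, sigma, tol=1e-10, max_iter=100):
    """Find a stationary point of the cubic model
        m(s) = g.T s + 0.5 s.T H s + (sigma/3) * ||s||^3
    via a fixed-point iteration on lam = sigma * ||s||, repeatedly
    solving the shifted Newton system (H + lam * I) s = -g.
    Illustrative only: assumes H + lam*I stays positive definite;
    practical codes use factorizations or Krylov subproblem solvers.
    """
    n = g.shape[0]
    lam = 0.0
    s = np.zeros(n)
    for _ in range(max_iter):
        s = np.linalg.solve(H + lam * np.eye(n), -g)  # shifted Newton system
        lam_new = sigma * np.linalg.norm(s)           # updated regularizer shift
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return s
```

At a fixed point, the step satisfies the stationarity condition of the cubic model, g + H s + sigma*||s||*s = 0, which is what adaptive cubic regularization methods require of their (approximate) subproblem solutions.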

Acceleration Methods

Author: Alexandre d'Aspremont
Total Pages: 262
Release: 2021-12-15
Genre: Technology & Engineering
ISBN: 9781680839289



This monograph covers recent advances in a range of acceleration techniques frequently used in convex optimization. Using quadratic optimization problems, the authors introduce two key families of methods, namely momentum and nested optimization schemes. These methods are covered in detail and include Chebyshev Acceleration, Nonlinear Acceleration, Nesterov Acceleration, Proximal Acceleration and Catalysts, and Restart Schemes. This book provides the reader with an in-depth description of the developments in acceleration methods since the early 2000s, whilst referring the reader back to the underpinning earlier work for further understanding. The topic is important in the modern-day application of convex optimization techniques in many areas. This book is an introduction to the topic that enables the reader to quickly understand the important principles and apply the techniques to their own research.
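As a concrete instance of the momentum family the monograph describes, here is the standard textbook form of Nesterov's accelerated gradient method (a generic sketch, not code from the book; `L` is an assumed Lipschitz constant of the gradient):

```python
import numpy as np

def nesterov_agd(grad, x0, L, iters=100):
    """Nesterov's accelerated gradient method for an L-smooth convex
    function. grad: callable returning the gradient at a point;
    L: Lipschitz constant of grad. Achieves the O(1/k^2) convergence
    rate on function values, versus O(1/k) for plain gradient descent.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L  # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x
```

The defining trick is that the gradient is evaluated at the extrapolated point `y`, not at the current iterate `x`; this look-ahead is what separates Nesterov's scheme from classical heavy-ball momentum.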

Variational Analysis

Author: R. Tyrrell Rockafellar
Publisher: Springer Science & Business Media
Total Pages: 747
Release: 2009-06-26
Genre: Mathematics
ISBN: 3642024319



From its origins in the minimization of integral functionals, the notion of variations has evolved greatly in connection with applications in optimization, equilibrium, and control. This book develops a unified framework and provides a detailed exposition of variational geometry and subdifferential calculus in their current forms beyond classical and convex analysis. Also covered are set-convergence, set-valued mappings, epi-convergence, duality, and normal integrands.

Implicit Filtering

Author: C. T. Kelley
Publisher: SIAM
Total Pages: 171
Release: 2011-09-29
Genre: Mathematics
ISBN: 1611971896



A description of the implicit filtering algorithm, its convergence theory and a new MATLAB® implementation.

Multi-Period Trading Via Convex Optimization

Author: Stephen Boyd
Total Pages: 92
Release: 2017-07-28
Genre: Mathematics
ISBN: 9781680833287



This monograph collects in one place the basic definitions, a careful description of the model, and discussion of how convex optimization can be used in multi-period trading, all in a common notation and framework.

Mathematics for Machine Learning

Author: Marc Peter Deisenroth
Publisher: Cambridge University Press
Total Pages: 392
Release: 2020-04-23
Genre: Computers
ISBN: 1108569323



The fundamental mathematical tools needed to understand machine learning include linear algebra, analytic geometry, matrix decompositions, vector calculus, optimization, probability and statistics. These topics are traditionally taught in disparate courses, making it hard for data science or computer science students, or professionals, to efficiently learn the mathematics. This self-contained textbook bridges the gap between mathematical and machine learning texts, introducing the mathematical concepts with a minimum of prerequisites. It uses these concepts to derive four central machine learning methods: linear regression, principal component analysis, Gaussian mixture models and support vector machines. For students and others with a mathematical background, these derivations provide a starting point to machine learning texts. For those learning the mathematics for the first time, the methods help build intuition and practical experience with applying mathematical concepts. Every chapter includes worked examples and exercises to test understanding. Programming tutorials are offered on the book's web site.

Numerical Algorithms

Author: Justin Solomon
Publisher: CRC Press
Total Pages: 400
Release: 2015-06-24
Genre: Computers
ISBN: 1482251892



Numerical Algorithms: Methods for Computer Vision, Machine Learning, and Graphics presents a new approach to numerical analysis for modern computer scientists. Using examples from a broad base of computational tasks, including data processing, computational photography, and animation, the textbook introduces numerical modeling and algorithmic design.

First-Order Methods in Optimization

Author: Amir Beck
Publisher: SIAM
Total Pages: 476
Release: 2017-10-02
Genre: Mathematics
ISBN: 1611974984



The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods that are frequently used in solving large-scale problems. First-order methods exploit information on values and gradients/subgradients (but not Hessians) of the functions composing the model under consideration. With the increase in the number of applications that can be modeled as large or even huge-scale optimization problems, there has been a revived interest in using simple methods that require low iteration cost as well as low memory storage. The author has gathered, reorganized, and synthesized (in a unified manner) many results that are currently scattered throughout the literature, many of which cannot typically be found in optimization books. First-Order Methods in Optimization offers a comprehensive study of first-order methods with their theoretical foundations; provides plentiful examples and illustrations; emphasizes rates of convergence and complexity analysis of the main first-order methods used to solve large-scale problems; and covers both variable and functional decomposition methods.
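To make the flavor of such methods concrete, below is a minimal sketch of proximal gradient descent (ISTA) for the lasso problem, a canonical first-order method: it uses only gradients of the smooth part plus a cheap proximal step for the l1 term. This is a generic illustration, not code from the book; the function name `ista` and parameter `lam` are this sketch's own:

```python
import numpy as np

def ista(A, b, lam, iters=500):
    """Proximal gradient (ISTA) for the lasso problem
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Each iteration costs one gradient of the smooth part and one
    soft-thresholding (the prox of the l1 norm) -- no Hessians.
    """
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth gradient
    for _ in range(iters):
        g = A.T @ (A @ x - b)      # gradient of 0.5*||Ax - b||^2
        z = x - g / L              # plain gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x
```

The low per-iteration cost and modest memory footprint are exactly the properties the book's treatment of large-scale problems emphasizes.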

Introductory Lectures on Convex Optimization

Author: Y. Nesterov
Publisher: Springer Science & Business Media
Total Pages: 253
Release: 2013-12-01
Genre: Mathematics
ISBN: 144198853X



It was in the middle of the 1980s when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization. The importance of this paper, containing a new polynomial-time algorithm for linear optimization problems, was not only in its complexity bound. At that time, the most surprising feature of this algorithm was that the theoretical prediction of its high efficiency was supported by excellent computational results. This unusual fact dramatically changed the style and directions of the research in nonlinear optimization. Thereafter it became more and more common that the new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments. In a new rapidly developing field, which got the name "polynomial-time interior-point methods", such a justification was obligatory. After almost fifteen years of intensive research, the main results of this development started to appear in monographs [12, 14, 16, 17, 18, 19]. Approximately at that time the author was asked to prepare a new course on nonlinear optimization for graduate students. The idea was to create a course which would reflect the new developments in the field. Actually, this was a major challenge. At the time only the theory of interior-point methods for linear optimization was polished enough to be explained to students. The general theory of self-concordant functions had appeared in print only once in the form of research monograph [12].